Issues in mass data upload

Hi All,
        Hope you all are doing fine.
        I have to do a master data upload for my next project. I have gone through LSMW, the DX Workbench, recording, etc., and I am now quite comfortable with all of these.
        As I have only tested these tools on a maximum of 8-10 records, I am interested in hearing about your experiences with actual data uploads, where the volume is high and the data may be difficult to verify manually. In particular, I am interested in how to make the upload faster, error-free, and consistent (no record being posted twice, etc.).
        Your inputs would be highly appreciated.
        Thanks a lot for your patient reading.
Bye and Regards.

Hi navdeep,
The approach to a mass data upload depends on the data you want to upload. There are several function modules for uploading data, for example to create reservations or to upload the long text of an inspection method. Tell me what data you want to upload.
Best Regards
Waleed Sadat
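As an illustration of the function-module route Waleed mentions, here is a rough ABAP sketch of creating a reservation with BAPI_RESERVATION_CREATE1. The material, plant, cost center, and movement type are made-up sample values, the field list is trimmed to the essentials, and the exact parameter types should be checked in SE37. Checking the RETURN table before committing is what keeps a mass run from posting bad records.

```abap
DATA: ls_header TYPE bapi2093_res_head,
      ls_item   TYPE bapi2093_res_item,
      lt_items  TYPE TABLE OF bapi2093_res_item,
      lt_return TYPE TABLE OF bapiret2,
      lv_resnum TYPE bapi2093_res_key-reserv_no.

" Header data - sample values only
ls_header-res_date   = sy-datum.
ls_header-move_type  = '201'.       " goods issue to cost center (example)
ls_header-costcenter = 'CC1000'.    " hypothetical cost center

" One item - sample values only
ls_item-material  = 'MAT-0001'.     " hypothetical material
ls_item-plant     = '1000'.
ls_item-entry_qnt = '10'.
APPEND ls_item TO lt_items.

CALL FUNCTION 'BAPI_RESERVATION_CREATE1'
  EXPORTING
    reservationheader = ls_header
  IMPORTING
    reservation       = lv_resnum
  TABLES
    reservationitems  = lt_items
    return            = lt_return.

" Commit only if no error messages came back; otherwise roll back,
" so a failed record in a mass run leaves nothing behind.
READ TABLE lt_return TRANSPORTING NO FIELDS WITH KEY type = 'E'.
IF sy-subrc = 0.
  CALL FUNCTION 'BAPI_TRANSACTION_ROLLBACK'.
ELSE.
  CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
    EXPORTING
      wait = 'X'.
ENDIF.
```

In a loop over thousands of records, logging lt_return per record (rather than stopping on the first error) makes reprocessing the failures much easier.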

Similar Messages

  • Mass Data upload in SAP from 3rd party system

    Hi Experts.
    Can anyone help me with how to do a mass data upload into SAP? Currently, when a new joiner is hired, the employee fills in a joining form, and that data is updated in SAP manually across various infotypes. Now I am planning to make that form available as a webpage. The employee will go to the webpage and fill in the data, HR will fill in the required fields, and once the form is complete the data will be updated in SAP in the respective infotypes in a single shot: personal details in infotype 0002, address in infotype 0006, bank details in infotype 0009, and so on. Is there a BAPI or something similar with which this can be achieved?
    Thanks
    S Kumar

    You can try BAPI_BANK_CREATE for IT0009, BAPI_ADDRESSEMP_CREATE for IT0006 and BAPI_PERSDATA_CREATE for IT0002. Otherwise, you can also use FM HR_MAINTAIN_MASTERDATA to create any infotype.
    Have a look also at the Life and Work Events functionality in SAP Portal (http://help.sap.com/erp2005_ehp_04/helpdata/EN/f6/263359f8c14ef98384ae7a2becd156/frameset.htm)
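For the mass-run case, a minimal ABAP sketch of the HR_MAINTAIN_MASTERDATA route mentioned above might look as follows. The personnel number and field values are hypothetical, and the exact importing parameter names should be verified in SE37; the point is that passing the screen field values via PROPOSED_VALUES with dialog_mode = '0' lets the call run without any screens, one employee at a time.

```abap
DATA: ls_prop   TYPE pprop,
      lt_prop   TYPE TABLE OF pprop,
      ls_return TYPE bapireturn1.

" Field values as they would appear on the infotype screen
" (sample data for infotype 0002 - Personal Data)
ls_prop-infty = '0002'. ls_prop-fname = 'P0002-NACHN'. ls_prop-fval = 'Smith'.
APPEND ls_prop TO lt_prop.
ls_prop-fname = 'P0002-VORNA'. ls_prop-fval = 'John'.
APPEND ls_prop TO lt_prop.

CALL FUNCTION 'HR_MAINTAIN_MASTERDATA'
  EXPORTING
    pernr           = '00001234'   " hypothetical personnel number
    actio           = 'INS'        " create a new record
    begda           = sy-datum
    endda           = '99991231'
    dialog_mode     = '0'          " no dialog - suitable for mass runs
  IMPORTING
    return          = ls_return
  TABLES
    proposed_values = lt_prop.

IF ls_return IS NOT INITIAL.
  " Log the error for this employee and continue with the next one.
ENDIF.
```

The same pattern, with a different INFTY and field names, covers infotypes 0006 and 0009, so one loop over the webpage data can post all infotypes per employee.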

  • Issue in Hierarchy data upload from R/3 for info object Product Hierarchy.

    Hi,
    I am trying to upload the hierarchy data from R/3 system for Info Object Product Hierarchy.
    Instead of the Business Content InfoObjects (0PRODH, 0PRODH1, 0PRODH2, 0PRODH3, 0PRODH4, 0PRODH5, 0PRODH6), we are using customized objects (ZPRODH, ZPRODH1, ZPRODH2, ZPRODH3, ZPRODH4, ZPRODH5, ZPRODH6).
    In the transfer rules, the mapping is as specified below:
    Fields        =>  Info Objects.
    0ZPRODH1 => ZPRODH1
    0ZPRODH2 => ZPRODH2
    0ZPRODH3 => ZPRODH3
    0ZPRODH4 => ZPRODH4
    0ZPRODH5 => ZPRODH5
    0ZPRODH6 => ZPRODH6
    Now, when I schedule the InfoPackage, it ends with the following errors:
    "Node characteristic 0PRODH1 is not entered as hierarchy characteristic for ZPRODH"
    "Node characteristic 0PRODH2 is not entered as hierarchy characteristic for ZPRODH"
    "Node characteristic 0PRODH3 is not entered as hierarchy characteristic for ZPRODH"
    "Node characteristic 0PRODH4 is not entered as hierarchy characteristic for ZPRODH"
    "Node characteristic 0PRODH5 is not entered as hierarchy characteristic for ZPRODH"
    "Node characteristic 0PRODH6 is not entered as hierarchy characteristic for ZPRODH"
    When I tried loading via flat file, there were no issues, but flat file loading is not allowed for us.
    Please let me know any possible solution to handle this issue.
    Thanks.
    Regards,
    Patil.

    Hi voodi,
    Instead of using the InfoObject 0PRODH1, we should use the customized InfoObject ZPRODH1, which I have already added as an external characteristic in the hierarchy.
    Regards,
    Patil.

  • Mass data upload from LSMW

    hi guys
    I have heard from many people that mass documents can be uploaded through LSMW.
    Please suggest how the path to the documents (files) can be specified, where it has to be given, and what care has to be taken.
    Regards
    surya

    Solved, closed.

  • Mass data Upload

    Hi Guys,
      I have to change the contents of some fields (maybe 2 fields) in the material master record for a minimum of around 7,000 materials, for which we can always go for BDC or mass maintenance.
    My question is: when there are this many records, is the standard mass maintenance tool still recommended? Please suggest which is the better option (BDC or mass maintenance) and why.
    Regards
    Mahesh

    Hi Mahesh,
                   In the case of BDC, you need the assistance of an ABAP resource to develop the program. If the effort involved in preparing the BDC for these two fields is small, go for it.
    Not all the fields in mass maintenance are modifiable, so first check which field you want to modify. It also depends on whether the data is in sequence; if so, you can give a range and your task will be easy.
    I suggest you go for BDC, as your activity will take less time.
    Regards
    Rajesh
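A batch-input run like Rajesh suggests is usually built from a recording. As a rough sketch, the CALL TRANSACTION pattern for MM02 looks like the following; the program, screen, and field names below are illustrative only, so record the real sequence with transaction SHDB first and transfer it into the BDCDATA table.

```abap
DATA: ls_bdc  TYPE bdcdata,
      lt_bdc  TYPE TABLE OF bdcdata,
      lt_msgs TYPE TABLE OF bdcmsgcoll.

" First screen of MM02: supply the material number.
" Screen sequence is illustrative - take the exact programs,
" dynpro numbers and field names from an SHDB recording.
CLEAR ls_bdc.
ls_bdc-program  = 'SAPLMGMM'.
ls_bdc-dynpro   = '0060'.
ls_bdc-dynbegin = 'X'.
APPEND ls_bdc TO lt_bdc.

CLEAR ls_bdc.
ls_bdc-fnam = 'RMMG1-MATNR'.
ls_bdc-fval = 'MAT-0001'.        " hypothetical material number
APPEND ls_bdc TO lt_bdc.

CALL TRANSACTION 'MM02' USING lt_bdc
     MODE   'N'                  " no screens - background processing
     UPDATE 'S'                  " synchronous update, easier to verify
     MESSAGES INTO lt_msgs.

" Inspect lt_msgs per material and log failures for reprocessing.
```

For 7,000 materials, running in mode 'N' with synchronous update and collecting MESSAGES INTO per material gives you a clean error log, so failed materials can be corrected and re-run without touching the successful ones.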

  • 4.1 Data Upload

    Our DBA just installed APEX 4.1, which we were hoping to use to simplify some of our data upload tasks for users.
    On my first attempt at using the Data Upload wizard, I am unable to select the destination table created specifically to receive the data upload. It is not in the drop-down list of the selected Schema.
    Is there some type of grant or other requirement necessary for a table to be shown on the drop-down list? The schema in question only shows four tables in the drop-down, but there are 48 tables in the schema.
    Thanks in advance for your help.

    Hi Korayovich,
    changing the column from VARCHAR2 to NUMBER should not be an issue for the Data Upload wizard. Once you have successfully changed the column, the new column type will be picked up the next time you try to upload.
    Make sure you have successfully changed the column type and you should be good. Otherwise, you can share your data so that I can take a look at what is happening.
    Regards
    Patrick

  • Mass Data Loads into SNP Key figures

    Hi All,
    Does anyone have any knowledge of doing mass data uploads against key figures (e.g. Safety Stock Planned)? There is a transaction, /SAPAPO/TSKEYFMAIN (Mass Maintenance of Time Series Key Figures), but it does not give me the option of loading thousands of materials at one time. Any thoughts would be appreciated.
    Rumi

    Rumi
    As Kaushik has mentioned, you can upload data from an InfoCube to the planning book for time series key figures, and you can read data maintained in Excel and upload it to your InfoCube.
    But I have a question: why can't you maintain the safety stock in the APO product master for the material/branch and use a macro to read the data from the product master and populate the safety stock key figure? Actually, you can maintain the safety stock in R/3, and as soon as you CIF the material to APO, the safety stock field in the product master will be populated and you can read it using a macro.
    Thanks
    Aparna
    Edited by: Aparna.Ranganathan on Dec 9, 2010 6:17 PM

  • Value Mapping Replication for Mass Data - Performance Issues

    Hi All,
    We are looking into Value Mapping Replication for Mass Data. We have done this for a small number of fields.
    Now we might have to keep 15,000 records in the cache for the value mapping. I am not sure how this would affect the Java cache and the Java engine as a whole.
    There might be a situation where we will have to leave the 15K records in the cache table on the Java engine.
    Are there any parameters we can look into just to see how this affects performance?
    Any links or guidance in the right direction would help.
    Regards

    Naveen,
    Check Jin's reply in this thread (they have done it with the API, and without the API using graphical mapping, but still with some issues):
    Value mapping performance using LookUp API
    ---Satish

  • Issues with 4.1 Data Upload

    I've got some issues with the new Data Upload feature in 4.1. I had already built the pages and the data loading, and it worked perfectly. Now a new column was added to my data loading table and I have to add a new table lookup for this column. Everything is set up as it should be; I've got 4 table lookups. When I try to upload some data and map the table columns to the data, I always get the error "Failed to retrieve the lookup value". There is no problem when I do a data load where it only has to retrieve one column from the lookup table; when it has to retrieve data from more tables for more columns, I always get the FAILED message. Does anyone know the cause of this? I already tried to create a totally new data loading, but this also failed to do the job.

    Hi Ellenvanhees,
    I don't think the number of lookups that you defined is an issue here. If possible, share more details about your data and tables; the few things that come to my mind probably relate to your data.
    If you are able to do the lookups one by one without problems, then I think your upload is failing due to null values. The current version of the data upload feature returns a failed lookup even if a null value is to be uploaded. This is bug #13582661 and it has been fixed for a future release.
    Patrick

  • Cost center data uploading issue in OKENN transaction

    Hi Gurus,
    I have uploaded cost center data using the LSMW IDoc method, and the data was loaded into table CSKS successfully. After that, I checked for the cost centers at the standard hierarchy level in transaction OKENN, but the cost centers do not appear under the hierarchy. Then I changed the hierarchy data (KHINR) in KS02, for example from SUS2021100 to SUS2021200, and the cost center was displayed under SUS2021200; when I changed the hierarchy back from SUS2021200 to SUS2021100, the cost center was displayed under SUS2021100. So when I loaded the data using LSMW, everything was loaded, but the cost centers were not displayed under the hierarchy until I manually changed the hierarchy data in transaction KS02. I don't understand what the problem is. Can anybody solve my issue? Thanks in advance.

    Hi Gurus,
    I am facing the same issue while uploading cost centers through the COSMAS01 basic type.
    The cost centers are created properly and we can display them with KS02. All the data is loaded correctly, even the standard hierarchy: in table CSKS, the field KHINR for the standard hierarchy is populated. Nevertheless, we do not find the cost centers in transaction OKEON.
    How can I solve this issue?
    Thanks for your help.
    Regards,
    Aurelien

  • Mass data load into SAP R/3 - with XI?

    Hi guys!
    I have an issue: a mass data migration into SAP R/3. Is XI a good solution? It will be about 60 GB of data. Or is there a better way to load this data?
    Thanks a lot!
    Olian

    hi,
    SAP doesn't recommend using XI for mass data migration, and 60 GB is certainly too much.
    Use LSMW for that purpose.
    Regards,
    michal

  • BW upgrade EHP1, data uploads should stop?

    Dear experts,
    we are planning a system upgrade. The current system is BW 7.0 SP17, now planned to go to EHP1 and SP9.
    I know there are some post-upgrade activities, which include consistency checks for objects (InfoObjects, transfer rules, cubes, DSOs, etc.).
    Could someone please confirm whether we need to stop the data uploads and process chains during the system upgrade?
    Thanks in advance!
    Best Regards,
    Mannu

    Hi Ingo,
    RSRT was giving proper results. We have now implemented a few SAP Notes and the issues were resolved.
    The following are the notes:
    1499233 - MDX:bXML flattening, unbalanced hierarchy, empty columns
    1485648 - MDX: bXML flattening and hierarchies and displaced columns
    1446245 - MDX: Error when RSR_MDX_BXML_GET_GZIP_DATA is called
    1441767 - MDX: No data for bXML if only ALL member is requested
    1438091 - MDX: basXML: Object MEASURE 0 not found
    1435844 - MDX:Wrong no. decimal places for basXML flattening interface
    1432162 - MDX: Flattening problems when using hierarchies
    1420169 - MDX: bXML flattening: Subsequent note
    1411491 - MDX: bXML flattening in transac. MDXTEST: Selecting packages
    1404328 - MDX: bXML flattening and PROPERTIES: Columns overwritten
    Thanks for your inputs.
    Regards,
    shesha.

  • GRC 10.1 - ARA - Mass Mitigation Upload

    I am having issues with the mass upload of mitigating controls for more than 300 rule IDs.
    I have tried to do it using GRAC_UPLOAD_MIT_ASSIGNMENTS, but it gives me an error.
    I am also not quite sure about the format of the file to upload. I am using the following columns:
    System | Risk ID | Rule ID | Control ID | Valid From | Valid To | Monitor | Active | Role Name
    Can you forward me a template for uploading mitigating controls? When I try to download the existing assignments, the first column has an entry as below, which I am not quite sure about:
    005056885ED71ED3BAF72E92A5659B83

    Can you forward me the template for uploading mitigating controls?
    I have tried to download it using GRAC_DOWNLOAD_MIT_ASSIGNMENTS, but the first column of the Excel file contains the characters 005056885ED71ED3BAF72E92A5659B83.
    The first row is as below; I am not sure about this value.
    005056885ED71ED3BAF72E92A5659B83
    F028
    FIN0003
    25.07.2014
    21.01.2015
    PRASADA
    X
    YEC_FI_ACC_MGMT_OFFICR_ANZ

  • Photo Gallery display by date uploaded

    I would really like to have the photo gallery display images by date uploaded, not alphabetically. I know this isn't possible, but I remember someone on the old forum posting a workaround for this issue. Does anyone know a workaround? I'm sure it had something to do with renaming the files.
    Thanks
    Karl

    Hi Sidney, thanks a lot for your reply. If I try your method, the most recent image still won't be placed at the beginning, e.g.:
    If today's date is 01022012, then tomorrow's date will be 02022012, which will still be sorted after 01022012:
    Eg
    01022012
    02022012
    Thanks again, but is there any other method?
    Karl

  • Optimization for bulk data upload

    Hi everyone!
    I've got the following issue:
    I have to do a bulk data upload using JMS, deployed in GlassFish 2.1, to process and validate data against an Oracle 10g database before it is inserted.
    I have a web interface that loads a file and then delegates processing to a stateless session bean, which reads N lines at a time and then sends a message to a JMS queue. The message-driven bean has to parse each line, validate it against the data already in the DB, and finally persist the new data.
    This process is very processing-intensive, and I need to improve the throughput. I tried changing the GlassFish default JMS and JDBC pool sizes, but I did not see a big difference.
    Do you have any advice that could help me?
    Thanks in advance!

    Hi! Thank you for your answer!
    The heavy processing is in the MDB.
    I'm grouping every N lines read in the EJB and then sending the message to the JMS queue. The MDB then persists each line as records in different related tables.
    Thanks again!
