Data Uploading in Production System

In the production system, in OX09, for a plant/storage location combination, we need to update some text (for example, "test") in the SORT1 field (Search Term 1) in the addresses of the storage locations.
We need to do this for about 50-60 storage locations in the production system. Kindly suggest the best way to achieve this without affecting other customizing settings.
The SORT1 field is in table ADRC. We can link to it by passing the plant to table TWLAD, reading the storage location and the address number field, then passing the address number to table ADRC and checking the SORT1 field.
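For reference, a minimal read-only SE38 sketch of that TWLAD-to-ADRC linkage (the plant value is a placeholder; this only checks the current SORT1 values and does not change anything):

  DATA: lt_twlad TYPE TABLE OF twlad,
        ls_twlad TYPE twlad,
        ls_adrc  TYPE adrc.

  " TWLAD holds the storage location addresses: plant (WERKS), storage location (LGORT)
  " and the address number (ADRNR) that points to the address record in ADRC
  SELECT * FROM twlad INTO TABLE lt_twlad
    WHERE werks = '1000'.                          " placeholder plant

  LOOP AT lt_twlad INTO ls_twlad.
    SELECT SINGLE * FROM adrc INTO ls_adrc
      WHERE addrnumber = ls_twlad-adrnr.
    IF sy-subrc = 0.
      WRITE: / ls_twlad-lgort, ls_adrc-addrnumber, ls_adrc-sort1.
    ENDIF.
  ENDLOOP.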
Regards
Amuthan M
Edited by: Amuthan M on Nov 18, 2008 10:45 AM

Since this is a customizing data update: normally for transactional data or any material master extension we use LSMW. In this case, is LSMW via a flat file the best way to do it?
Any more inputs please; we need to finalise the approach ASAP.
Regards
Amuthan M

Similar Messages

  • DATA UPLOADING IN SAP SYSTEM

    What is the meaning of data uploading in an SAP system? Which transaction code is used, and how is it done?

    Data uploading: this is uploading master data such as material masters, BOMs, routings, etc. from a legacy system, or newly created data, in an uploadable format.
    The data is then uploaded using tools such as SCAT, LSMW, BDC, etc.
    When you first set up an SAP landscape, the system will not have any data or configuration; it is a plain box.
    So the configuration has to be done by the consultants, and later all the master data, static and dynamic data, etc. has to be uploaded before you can start working in SAP.
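    As an illustration of the BDC approach mentioned above, a minimal CALL TRANSACTION sketch (the program, screen number and field names are only assumptions for MM01 and would normally come from an SHDB recording):
      DATA: lt_bdcdata TYPE TABLE OF bdcdata,
            ls_bdcdata TYPE bdcdata,
            lt_msgs    TYPE TABLE OF bdcmsgcoll.
      " first screen of the target transaction (values assumed, normally taken from an SHDB recording)
      ls_bdcdata-program  = 'SAPLMGMM'.
      ls_bdcdata-dynpro   = '0060'.
      ls_bdcdata-dynbegin = 'X'.
      APPEND ls_bdcdata TO lt_bdcdata.
      " one field value on that screen
      CLEAR ls_bdcdata.
      ls_bdcdata-fnam = 'RMMG1-MATNR'.
      ls_bdcdata-fval = 'TEST-MAT-001'.
      APPEND ls_bdcdata TO lt_bdcdata.
      " play the batch input in the background with synchronous update and collect the messages
      CALL TRANSACTION 'MM01' USING lt_bdcdata
                              MODE 'N'
                              UPDATE 'S'
                              MESSAGES INTO lt_msgs.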

  • Looking for SAP solution to extract data from the production system

    Hello,
    we are looking for an SAP standard tool to extract data from the production system and copy it to the development system. It should be as functional as some kind of partial client copy. The data which we need to replicate is master and transactional data from the FI, CO, AA, MM and (nice to have) FS-RI modules. There is a product offered by IntelliCorp called "Data Management", but we would like to know if SAP offers its own product.
    Kind regards,
    Krzysztof Murkowski

    Hi,
    SAP offers a solution called 'Test Data Migration Server' (TDMS).
    You can access the master guide via the link below:
    https://service.sap.com/~form/sapnet?_FRAME=CONTAINER&_OBJECT=011000358700006332942006E
    Cheers !!
    Satya.
    PS: Please reward points if the answer was helpful. Thanks.

  • Not able to change the data of test data containers in production system

    Dear All,
    We have created eCATT scripts in the development SolMan system and moved the transports to the production SolMan system. The customer wants to change the data in the test data containers and run the scripts in the production system, but we are not able to edit the data.
    Maybe the reason is that transaction SCC4 has the option below set:
    Changes and Transports for Client-Specific Objects
    • No changes allowed
    The customer does not want to change the above option, but wants to change the test data containers to supply different data and run the eCATT scripts.
    Could you please let me know the solution for this?
    Your help is really appreciated.
    Thanks,
    Mahendra

    eCATT has a feature where you do not need to transport the scripts or test configurations to the target system. You can keep all your scripts and test data in SolMan and run a script in any other system in your landscape using the system data container and target system.
    Maintain production as one of the target systems in the system data container in SolMan and point to that system when running the script. Change the test data in SolMan to run the script.
    Let me know if you need more information.
    thanks
    Venkat

  • Master data upload into SAP system

    Hello,
    I want to know if there is any standard method to upload material master, customer master, vendor master and finance master data into an SAP system.
    I am not referring to LSMWs, BDCs or BAPIs. I am aware of standard programs like RMDATIND for material master upload, RFBIDE00 for customer master upload and RFBIKR00 for vendor master upload. But these use the direct input method, and SAP recommends this only for testing purposes. I am not sure if they can really be used in actual live scenarios.
    From some other posts in the forum, I came to know about transactions like BDLR, SXDB and BMVO. Can someone tell me how to use these transaction codes?
    If someone has detailed documentation on these transaction codes, or on standard master data upload techniques in general, please send it to [email protected]
    Thanks in advance,
    CMV

    Hi,
    Define the following attributes, using the F4 input help and F1 field help:
    Report: the name of a registered program for this program type.
    Variant: you can only specify a variant for programs that are started directly.
    With direct input, data from the data transfer file undergoes the same checks as in the online transaction and is then transferred directly into the SAP system. The database is updated directly with the transferred data.
    For the documentation of the other transactions, please refer to the corresponding program documentation, which is more helpful.
    Reward points if helpful.
    Regards,
    jinesh
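    To illustrate how a registered direct-input program such as RMDATIND is typically run, a minimal background-job sketch (the job name and variant name are hypothetical; the variant would point to the prepared data transfer file):
      DATA: lv_jobname  TYPE tbtcjob-jobname VALUE 'MATERIAL_DIRECT_INPUT',
            lv_jobcount TYPE tbtcjob-jobcount.
      " open a background job, submit the direct-input program with its variant, then release the job
      CALL FUNCTION 'JOB_OPEN'
        EXPORTING
          jobname  = lv_jobname
        IMPORTING
          jobcount = lv_jobcount.
      SUBMIT rmdatind USING SELECTION-SET 'ZMM_UPLOAD'   " hypothetical variant naming the transfer file
             VIA JOB lv_jobname NUMBER lv_jobcount
             AND RETURN.
      CALL FUNCTION 'JOB_CLOSE'
        EXPORTING
          jobname   = lv_jobname
          jobcount  = lv_jobcount
          strtimmed = abap_true.   " start the job immediately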

  • BCS - master data change in productive system?

    Hi group,
    We are about to go live with our BCS system, but of course someone got the great idea of changing the company code currency for one of the "non-SAP" companies in the group since the system was set up.
    When I enter the consolidation workbench in our production system, everything is considered customizing, so no changes are allowed, even to master data. I've searched the "Object changeability" settings of the BW system but couldn't find any settings regarding BCS. I also checked the settings in SE06 and found a namespace "/FINB/", which I suspect has something to do with BCS (it is currently set to "Not modifiable").
    What are the correct system settings for the BCS system to function correctly, and what is the normal procedure for maintaining master data in SEM-BCS? - by transport requests or by changing directly in the productive system?
    Best regards
    Thomas
    P.S.: Are there any useful resources for BCS on the Internet? The help in the system is very limited and I can't find anything on the SAP marketplace.
    Message was edited by:
            Thomas Ringe Pontoppidan

    Hi Dan,
    Example: you have a single selection containing a set (not a range) of separate values. Then you decide to change the set (adding one more value, for example). Are you saying that this single selection change in DEV will be reflected automatically in PROD?
    From my experience I know that, while using single selections, the system replaces them with the set/range/node. When a single selection is changed, nothing happens yet to the elimination settings that use it. You have to go into each part of the settings in the method and refresh the single selection; only after that will you see that the set of values has changed. It is obvious to me that if I do not do this refreshing in DEV and then import the request into PROD, I will have the old, unmodified version of my settings.

  • EIC- Data Inconsistencies in Productive System

    Dear EIC Experts,
    I have a query regarding data inconsistencies in the productive system. I will illustrate with an example:
    An activity was created, and a follow-up activity was created on it and assigned to a resolver group.
    There is an entry in SCMG_T_CASE_ATTR with status 20 (Allocated),
    and an entry in THREIC_ACTIVITY with a whole load of data missing, such as the creator org unit and the owner org unit; all it has are the activity categories and the user ID.
    An entry exists in THREIC_FOLLOWUP with all relevant fields filled, but the workflow ID is blank. The status in this table is 40 (Delivered).
    So it looks as if no workflow was created and it did not go to anybody for action. The activity cannot be closed and is just lying there, as the owner cannot take any action until the follow-up status becomes completed.
    1) What could be the possible reasons for this to happen? I have already checked that the workflows are all activated, and other activities are being raised and closed just fine.
    2) Is there any way I can use utility tools to correct the data in the back end so that this activity can be closed?
    I have a few hundred activities like these (out of 30,000 overall activities) which cannot be closed because of various data inconsistencies. Any ideas are appreciated.

    Hi Harish
    I have worked on multiple EIC implementations and I have never had this issue where the follow-up functionality has been set up correctly; from the details in your message, my guess is that it is not set up correctly at your location.
    It is possible to fix these 100 or so activities and set them to closed, but it is too much to type out in this forum.
    I would check out www.eicexperts.com as they are experts in EIC and they may walk you through it as they are very helpful.

  • Need help with enhanced data source in Production system

    Hello Gurus,
    1. I enhanced a DataSource in BW and populated the new field in a customer exit via CMOD. In the development system I do not have much data, so I deleted all the data and did a full load.
    What should I do on the production side so that the deltas are not affected? Since production has millions of records, we will not do a full load. What is the best way to populate the field in production, after transporting the DataSource, without disturbing the deltas, so that the new field is also filled for previous years' data?
    2. Can we put 0CUSTOMER and 0MATERIAL in the same dimension? How is it going to affect performance?
    Thanks in advance.,
    Best Regards,
    Pavan

    Hi,
    Please see this:
    1. See this thread: "populated the new field with historic data".
    2. Can we put 0CUSTOMER and 0MATERIAL in the same dimension? How is it going to affect performance?
    It is better not to put them in a single dimension, because one customer can take more than one material; if you have 100 customers and 1,000 materials, this combination will generate a large number of records. It is always better to keep characteristics that have a 1:N relationship in one dimension. In your case, customer and material have an M:N relationship, which will result in slow performance.
    Regards,
    Ravi
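    For reference, the customer-exit pattern mentioned in the question normally goes into include ZXRSAU01 (enhancement RSAP0001, EXIT_SAPLRSAP_001 for transaction data). A minimal sketch; the DataSource name, extract structure and appended field below are hypothetical:
      DATA: ls_data  TYPE zox_extract_struc,   " hypothetical extract structure containing the appended field ZZNEWFIELD
            lv_tabix TYPE sy-tabix.
      CASE i_datasource.
        WHEN 'ZMY_DATASOURCE'.                 " hypothetical enhanced DataSource
          LOOP AT c_t_data INTO ls_data.
            lv_tabix = sy-tabix.
            " derive the appended field from existing fields or a lookup table
            ls_data-zznewfield = 'DERIVED'.
            MODIFY c_t_data FROM ls_data INDEX lv_tabix.
          ENDLOOP.
      ENDCASE.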

  • Data upload from Mainframe system

    Hi Champs,
    I need to extract data from a mainframe system and upload it into BW. Can anyone help with the steps?
    Thanks in advance.
    Abhishek

    With or without using an ETL tool?
    If you are not using an ETL tool:
    Does the mainframe have a spurbox or a logical destination that has an IP address?
    If so, the mainframe produces a flat file in fixed format and sends it to that IP address.
    Then do an EBCDIC-to-ASCII conversion and use a fixed-format transfer structure to load the file up into BW.
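    As an illustration of the EBCDIC-to-ASCII step once the flat file has landed on the BW application server, a minimal sketch (the file path, record length and SAP code page number are placeholders; the correct code page for the mainframe's EBCDIC variant can be checked in table TCP00):
      DATA: lv_file TYPE string VALUE '/usr/sap/trans/data/mainframe.dat',   " placeholder path
            lv_line TYPE c LENGTH 200.                                       " assumed fixed record length
      " LEGACY TEXT MODE with CODE PAGE converts from the given (EBCDIC) code page while reading
      OPEN DATASET lv_file FOR INPUT IN LEGACY TEXT MODE CODE PAGE '0120'.   " placeholder code page
      IF sy-subrc = 0.
        DO.
          READ DATASET lv_file INTO lv_line.
          IF sy-subrc <> 0.
            EXIT.
          ENDIF.
          " map lv_line into the fixed-format transfer structure here
        ENDDO.
        CLOSE DATASET lv_file.
      ENDIF.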

  • BW Archiving run Locks data upload

    Hello BW Experts.
    We are working with archiving, but we have one problem: when we load data into an InfoProvider, the data load fails due to archiving run locks.
    If we run the delete step of the archiving run, then we can upload data to the InfoProvider without any issues.
    But we do not want to perform the delete step immediately; we want to do it after a few days. However, this causes a big issue for data loading in the production system.
    Is there any program to unlock an archiving session?
    Is there any other process to unlock the tables?
    I know that we can unlock the archiving session through table RSARCHREQ, but I feel this is not the right way to do it.
    Any help regarding this is appreciated.
    Thanks and Regards
    Venkat

    Hi Harsh,
    The archiving process has two steps: the archiving step, which writes the archive file, and the delete step, which deletes the data. The data target is locked until the second step is completed.
    If you do not wish to delete the data yet but want to release the locks, change the archiving process status to invalidate the archiving run (transaction SARA).
    I hope this helps,
    Mike.

  • Infosource invisible in production system

    Hello,
    We are facing a unique problem: we are not able to find the InfoSource for an ODS and one InfoCube in our production system. In the quality system it was working fine and we could see these InfoSources. But after transporting the complete development to the production system, we could not locate the required InfoSources.
    As a matter of fact, these InfoSources are available in the data flow for the cube, and we can reach an InfoSource by double-clicking it in the data flow. But the same InfoSources are not visible on the InfoSources tab in the AWB in the production system.
    We tried various options, for example:
    1. RSA1 -> InfoSources -> menu Settings -> Display generated objects -> select the radio button "Show generated objects". Still the InfoSources were not visible.
    Can anyone please provide some input on this?
    Thanks,
    Vipul

    Hi Anil,
    Yes, you are right; my application component had not been transported to the production system.
    My InfoAreas are lying under the unassigned nodes, and so are the InfoObjects.
    But I cannot understand why my application component was not transported to the production system, when it was successfully transported to the quality system, where everything is working fine.
    I have transported the same request to the production system as well. Can you provide some insight into this?
    And to the other replies I can say that I cannot replicate and activate DataSources in the production system; I have already done these steps in the development system.
    Thanks and regards.
    Vipul

  • Info package issue in Production system?

    Hi experts,
    I would like to know about the scenario of InfoPackages in the production system.
    1) When the objects are transported to the production system, do we use the same InfoPackages in the production system that were used to schedule the loads in the development system?
    2) Do we create new InfoPackages every time for the data load in the production system? Please let me know about this.
    3) In my case, the InfoPackages that were transported to the production system ask for a request number when I schedule them. Is that normal? I mean, every time we schedule, will it ask for a request number in the production system?
    Please let me know the normal procedure for loading data in the production system (the standard procedure followed by all developers to load data).
    With regards,
    Dubbu

    1) When the objects are transported to the production system, do we use the same InfoPackages in the production system that were used to schedule the loads in the development system?
    Ans: The choice is yours. In our system we create InfoPackages directly in production; it is simply more flexible for us.
    2) Do we create new InfoPackages every time for the data load in the production system? Please let me know about this.
    Ans: InfoPackages are reusable, but if they are used in process chains it is better to create new ones, as changes will affect the associated process chain.
    3) In my case, the InfoPackages that were transported to the production system ask for a request number when I schedule them. Is that normal? I mean, every time we schedule, will it ask for a request number in the production system?
    Ans: Yes, it is normal for it to ask for a request number, but in the development system.

  • BI Production System with R3 SYSTEM

    Hello,
    We are installing the BI production system. Do we have to connect it to the R/3 system or not?

    Hi
    You definitely have to connect it to an R/3 system, because otherwise where would you load data into your production system from?
    You have to connect the two systems with a Remote Function Call (RFC) connection.
    Refer to the link below:
    http://help.sap.com/saphelp_nw04/helpdata/en/fa/731a403233dd5fe10000000a155106/frameset.htm

  • Issue in Hierarchy data upload from R/3 for info object Product Hierarchy.

    Hi,
    I am trying to upload hierarchy data from the R/3 system for the InfoObject product hierarchy.
    Instead of the Business Content InfoObjects (0PRODH, 0PRODH1, 0PRODH2, 0PRODH3, 0PRODH4, 0PRODH5, 0PRODH6), we are using customized objects (ZPRODH, ZPRODH1, ZPRODH2, ZPRODH3, ZPRODH4, ZPRODH5, ZPRODH6).
    In the transfer rules, the mapping is as specified below:
    Fields        =>  InfoObjects
    0ZPRODH1 => ZPRODH1
    0ZPRODH2 => ZPRODH2
    0ZPRODH3 => ZPRODH3
    0ZPRODH4 => ZPRODH4
    0ZPRODH5 => ZPRODH5
    0ZPRODH6 => ZPRODH6
    Now, when I schedule the InfoPackage, it ends with the errors:
    "Node characteristic 0PRODH1 is not entered as hierarchy characteristic for ZPRODH"
    "Node characteristic 0PRODH2 is not entered as hierarchy characteristic for ZPRODH"
    "Node characteristic 0PRODH3 is not entered as hierarchy characteristic for ZPRODH"
    "Node characteristic 0PRODH4 is not entered as hierarchy characteristic for ZPRODH"
    "Node characteristic 0PRODH5 is not entered as hierarchy characteristic for ZPRODH"
    "Node characteristic 0PRODH6 is not entered as hierarchy characteristic for ZPRODH"
    When I tried to load via flat file, there were no issues. But flat file loading is not allowed for us.
    Please let me know any possible solution to handle this issue.
    Thanks.
    Regards,
    Patil.

    Hi voodi,
    Instead of using the InfoObject 0PRODH1, we should use the customized InfoObject ZPRODH1, which I have already added as an external characteristic in the hierarchy.
    Regards,
    Patil.

  • LSMW UPLOAD IT 0655 in Production System

    Hi,
    Friends, while uploading data into infotype 0655 in the production system, for a few personnel numbers I am getting this error:
    "Missing secondary record for infotype 0002" for a particular date and a particular PERNR.
    But I can see in the system that this record exists.

    If you check screen 2000 of module pool MP065500, there is a validation against infotype 0002 when creating the 0655 record:
      if psyst-first eq yes and cprel-endda lt high_date.
    *   I105(RP): Validity end must be equal to &
        p0002-endda = high_date.                               "VLDN212840_2
        cprel-endda = high_date.                               "VLDN212840_2
        message e105(rp) with high_date.
      endif.
      if psyst-ioper eq modify.
        if cprel-endda lt high_date and pskey-endda eq high_date.
          p0002-endda = high_date.                             "VLDN212840_2
          cprel-endda = high_date.                             "VLDN212840_2
          message e105(rp) with high_date.
        endif.
      endif.
    From what I understand, the end date of your 0655 record has to be equal to that of infotype 0002. I am not sure why it is done this way, but it is hard-coded in the standard screen logic, so it is not a customizing issue.
