Update the on-hand quantities through Data Loader

Dear All,
I need to update the on-hand quantities of a number of items (about 5,000). I hope I can do this with Data Loader, but which screen do I need to go to in order to update the stock? Is that the right way, or is there an alternative way to accomplish this stock update?
Please advise.
Many thanks in advance.

Hi Friend,
Whenever we need to upload inventory, we use the Create Immediate form in Oracle Applications (OA), which consists of Reason Code / Item / Lot / WH / Qty. For this you need to create a Data Loader script which looks like the one below:

Reason Code TAB Item  TAB Lot  TAB SB  WH TAB Qty SAVE SB *NR
POST        TAB raw1  TAB L101 TAB SB  W1 TAB 150 SAVE SB NR
ADD         TAB FG123 TAB F101 TAB *SB W2

Similar Messages

  • To update the IDoc segment values through a report program

    My requirement is to update IDoc segment values from a report program. Is any SAP-provided standard function module available to update IDoc segment values? Please help.

    DATA: LT_EDIDD TYPE STANDARD TABLE OF EDIDD. "Local table to hold the IDoc data records
        LT_EDIDD = I_EDIDD. "This table should already contain the segment data
    *-- Opening the IDoc for Edit
        CALL FUNCTION 'EDI_DOCUMENT_OPEN_FOR_EDIT'
          EXPORTING
            DOCUMENT_NUMBER               = X_EDIDC-DOCNUM
          TABLES
            IDOC_DATA                     = LT_EDIDD
          EXCEPTIONS
            DOCUMENT_FOREIGN_LOCK         = 1
            DOCUMENT_NOT_EXIST            = 2
            DOCUMENT_NOT_OPEN             = 3
            STATUS_IS_UNABLE_FOR_CHANGING = 4
            OTHERS                        = 5.
        IF SY-SUBRC <> 0.
          MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
              WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
        ENDIF.
    *-- Editing the IDoc
        CALL FUNCTION 'EDI_CHANGE_DATA_SEGMENTS'
          TABLES
            IDOC_CHANGED_DATA_RANGE = LT_EDIDD
          EXCEPTIONS
            IDOC_NOT_OPEN           = 1
            DATA_RECORD_NOT_EXIST   = 2
            OTHERS                  = 3.
        IF SY-SUBRC <> 0.
          MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
                  WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
        ENDIF.
    *-- Closing the IDoc after Edit
        CALL FUNCTION 'EDI_DOCUMENT_CLOSE_EDIT'
          EXPORTING
            DOCUMENT_NUMBER  = X_EDIDC-DOCNUM
            DO_COMMIT        = 'X'
            DO_UPDATE        = 'X'
            WRITE_ALL_STATUS = 'X'
          EXCEPTIONS
            IDOC_NOT_OPEN    = 1
            DB_ERROR         = 2
            OTHERS           = 3.
        IF SY-SUBRC <> 0.
          MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
              WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
        ENDIF.
    thanks
    vijay

  • How autoextend affects the performance of a big data load

    I'm doing a bit of reorganization on a data warehouse, and I need to move almost 5 TB worth of tables and rebuild their indexes. I'm creating a tablespace for each month, using BIGFILE tablespaces, and assigning 600 GB to each, which is the approximate size of the tables for each month. Just allocating the space takes a lot of time, so I decided to try a different approach: change the datafile to AUTOEXTEND ON NEXT 512M and then run the ALTER TABLE MOVE commands to move the tables. The database is Oracle 11g Release 2, and it uses ASM. I was wondering which would be the best approach between these two:
    1. Create the tablespace with AUTOEXTEND OFF, assign 600 GB to it, and then run the ALTER TABLE MOVE commands. The space would be enough for all the tables.
    2. Create the tablespace with AUTOEXTEND ON, without assigning more than 1 GB, and run the ALTER TABLE MOVE commands. The diskgroup has enough space for the expected size of the tablespace.
    With the first approach my database takes roughly 10 minutes to move each partition (there is one for each day of the month). Would this number be impacted in a big way if the database has to autoextend every 512 MB?

    If you measure the performance as the time required to allocate the initial 600 GB data file plus the time to do the load, and compare that to allocating a small file and doing the load while letting the data file autoextend, it's unlikely that you'll see a noticeable difference. You'll get far more variation just in moving 600 GB around than you'll lose waiting on the data file to extend. If there is a difference, allocating the entire file up front will be slightly more efficient.
    More likely, however, is that you wouldn't count the time required to allocate the initial 600 GB data file since that is something that can be done far in advance. If you don't count that time, then allocating the entire file up front will be much more efficient.
    If you may need less than 600 GB, on the other hand, allocating the entire file at once may waste some space. If that is a concern, it may make sense to compromise and allocate a 500 GB file initially (assuming that is a reasonable lower bound on the size you'll actually need) and let the file extend in 1 GB chunks. That won't be the most efficient approach and you may waste up to a GB of space but that may be a reasonable compromise.
    Justin
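    For reference, a minimal SQL sketch of the two options being compared (the tablespace, diskgroup and table names here are illustrative, not from the original post; the two CREATE statements are alternatives, not a sequence):

      -- Option 1: pre-allocate the full file up front, no autoextend
      CREATE BIGFILE TABLESPACE ts_2011_01
        DATAFILE '+DATA' SIZE 600G
        AUTOEXTEND OFF;

      -- Option 2: start small and let the file grow in 512 MB steps
      CREATE BIGFILE TABLESPACE ts_2011_01
        DATAFILE '+DATA' SIZE 1G
        AUTOEXTEND ON NEXT 512M MAXSIZE UNLIMITED;

      -- Either way, each partition is then moved with something like:
      ALTER TABLE sales MOVE PARTITION p_2011_01_01 TABLESPACE ts_2011_01;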

  • Update the "remark" field in general data of a BP (using IDOC in LSMW)

    Hi,
    I have to create and update BPs with comments in the general data. I have no problem managing this field in creation mode: I simply use the segment "E101BAPIAD_REM" from the "CRMXIF_PARTNER_SAVE_M02" IDoc basic type and fill the fields "LANGU", "LANGU_ISO" and "ADR_NOTES".
    However, in update mode I do not succeed in updating the value of the field "REMARK". I use the second segment "E101BAPIAD_REX", filling only the field "ADR_NOTES" with value "X" and "UPDATEFLAG" with value "U".
    I tried several values and several cases but failed each time. If anybody has already managed this data with an IDoc, I would be very interested...
    regards,
    Fabrice Mouyebe.

    Hi Thirumala,
    thank you for your answer, but you mention the field "AddressGUID" of E101BUS_EI_ADDRESSREMARK, and I don't have this field at this level. This field is linked to the segment "E101BUS_EI_BUPA_ADDRESS_KEY", isn't it?
    Indeed, I certainly have a problem with the address GUID. I have to test it, because it doesn't work with only UPDATEFLAG set to 'X' and CURRENT_TASK either.
    Thanks,
    fabrice.
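    For anyone hitting the same wall, a minimal, untested ABAP sketch of how the data/checkbox segment pair is meant to work, as far as I understand it (the segment and field names come from this thread; the values are illustrative):

      DATA: ls_rem  TYPE e101bapiad_rem,  "data segment: carries the new remark
            ls_remx TYPE e101bapiad_rex.  "checkbox segment: flags the fields to update

      " data segment: the new remark text
      ls_rem-langu     = 'E'.
      ls_rem-langu_iso = 'EN'.
      ls_rem-adr_notes = 'New remark text'.

      " checkbox segment: 'X' marks ADR_NOTES as "to be updated"
      ls_remx-adr_notes  = 'X'.
      ls_remx-updateflag = 'U'.

    As discussed above, whether the update is actually applied can also depend on supplying the right address key/GUID in E101BUS_EI_BUPA_ADDRESS_KEY.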

  • One computer updated, the other says up to date, but they are on the old version and reinstalling doesn't work.

    I updated to the latest version on my desktop; then my credit card was hacked, so it was replaced and I had to reset my credit card info. When I tried to update my laptop, the Creative Cloud app said everything was up to date, but trying to load a file from the updated Premiere app on my desktop onto my laptop fails with a "Newer version" error message. I uninstalled and reinstalled Premiere, but that did not work.

    Alw51, what specific "Newer version" error message are you referring to? Also, which operating system are you using? Finally, what version of Premiere Pro are you downloading and installing?

  • How to update the sales order header & item data in TM system

    Hi Experts,
    Greetings!
    I need your help. I have a requirement: sales order data comes from ECC, and this sales order data needs to be updated in the TM sales order header table as well as the item table; these fields are additional (custom) fields.
    Can anyone please guide me? I am very new to TM.
    Thanks in advance.
    Thanks&Regards,
    Siva.

    Hi Siva,
    "/SCMTMS/TRQ~ROOT" is the sales order header node and "/SCMTMS/TRQ~ITEM" holds the item details.
    I assume you need to:
    1. Enhance the structures of these nodes to hold your additional fields;
    2. Do the same for the input parameter of the service TransportationRequestRequest_In (which is used to create the OTR) on the PI side;
    3. Pass the additional fields during the service call (implemented in the ERP system);
    4. Map the fields from the service parameter to the node attributes (implemented in the TM system, in BAdI /SCMTMS/TRQ_SE_TPNRQ_REQ~CHANGE_MODIFICATION, by creating a modification table for the input parameter); see the sketch after this reply.
    I cannot provide source code for all of that; hope it helps.
    Sensen
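    A minimal, untested sketch of step 4, assuming the BAdI method hands you a BOPF-style modification table (lv_root_key, lr_root_data and ct_modification are placeholder names; check the actual method signature in your system):

      DATA ls_mod TYPE /bobf/s_frw_modification.

      " one modification entry per sales order header to change
      ls_mod-node        = /scmtms/if_trq_c=>sc_node-root.    "TRQ header node
      ls_mod-key         = lv_root_key.                       "key of the TRQ root instance
      ls_mod-data        = lr_root_data.                      "data reference incl. the added fields
      ls_mod-change_mode = /bobf/if_frw_c=>sc_modify_update.  "update existing data
      APPEND ls_mod TO ct_modification.

    The same pattern with sc_node-item should cover the item fields.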

  • How to update the price based upon PGI date

    Hi
    I have an issue with updating prices and freight based on the PGI date in billing. We are using two billing types, for excise and tax invoice creation, and in copy control the pricing type is maintained as "C" for both billing types with a single delivery. Somehow the MRP in the excise billing is picked from a condition record whose validity has ended, while the tax invoice picks up the correct prices.
    Both pricing condition types have pricing type "B" (from billing date), and for freight we have maintained "A" (SRD).
    But in some cases, especially for the excise-related part that is based on the MRP, we are facing this issue.
    The pricing date is somehow coming from the sales document.
    Please find the problem details in the attachment

    Hi,
    If you look at the two condition tab snapshots you can understand it clearly: the two invoices were created on two different dates, and you have maintained the pricing date as C (billing date, KOMK-FKDAT). Because of this, the price of ZMRP comes out differently. After creating the first invoice you must have changed the ZMRP amount, so while creating the second invoice the system took the new ZMRP price at billing level.
    Note: while creating the second invoice, the PGI date might have come into the billing level, but someone may have changed the billing date manually at the header level of the billing document. Please check that as well.
    Kindly let me know if you need further help on this.
    Thanks,
    Naren

  • Update rule not working with high data load.

    Hi all,
    I have a problem with an update rule: it's an update loop in a DSO. In the start routine I do three SELECT ... FOR ALL ENTRIES IN DATA_PACKAGE statements on the active table of another structure; then I read those tables to update some values in my update rule.
    I did some tests and it seemed to work well (I tried, for example, just one plant), but when I launched it for all the records in the DSO the result was different and, for the same plant, the values were not updated correctly (they were blank).
    The routine is exactly the same, so it seems strange to me that when launching the InfoPackage without filters I can't get the same correct result as in my previous test. I was wondering what could be the reason for this error...
    Anyone can help?
    The start routine is this:
      REFRESH i_tab.
      SELECT field1 field2 field3
        INTO TABLE i_tab
        FROM target_dso
        FOR ALL ENTRIES IN DATA_PACKAGE
        WHERE deliv_numb = DATA_PACKAGE-deliv_numb
          AND deliv_item = DATA_PACKAGE-deliv_item
          AND act_gi_dte <> l_blank_date.
    Then I read this table in the other routines...

    It is hard to say. What does the update rule look like?
    After the READ statement you could check the return code; if it is not zero, go into an infinite loop and debug the running job via SM37:
    READ TABLE i_tab ... "your existing lookup
    IF sy-subrc <> 0.
      WHILE 1 = 1.
        "debug in SM37.
      ENDWHILE.
    ENDIF.
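    One thing worth ruling out as well (an editorial guess, not something confirmed in this thread): SELECT ... FOR ALL ENTRIES selects every row when the driver table is empty, and the order of the result rows is not guaranteed, so any READ TABLE ... BINARY SEARCH needs an explicit SORT first. A defensive version of the start routine select would look like this:

      " Guard: FOR ALL ENTRIES with an empty driver table selects ALL rows
      IF NOT DATA_PACKAGE[] IS INITIAL.
        SELECT field1 field2 field3
          INTO TABLE i_tab
          FROM target_dso
          FOR ALL ENTRIES IN DATA_PACKAGE
          WHERE deliv_numb = DATA_PACKAGE-deliv_numb
            AND deliv_item = DATA_PACKAGE-deliv_item
            AND act_gi_dte <> l_blank_date.
        " Sort by whatever key the later READ ... BINARY SEARCH uses
        SORT i_tab BY field1 field2.
      ENDIF.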

  • How to update existing table using Data Load from spreadsheet option?

    Hi there,
    I need to update an existing table, but in the Data Load application, when you select a csv file to upload, it inserts all the data, replacing the existing data. How can I change this?
    Let me know,
    Thank you.
    A.B.A.

    And how do you expect your database server to access a local file on your machine?
    Is the file accessible from outside your machine, say inside a webserver folder, so that some DB process can poll the file?
    Or is your DB server on the same machine where you have the text file?
    You will have to figure out the file access part before automating user interaction or even auto-refreshing.

  • Data load through DTP giving error while calling FM RSDRI_INFOPROV_READ

    Hi All
    We are trying to load data into a cube through a DTP from a DSO. In the transformation, we look up InfoCube data through the SAP standard function module 'RSDRI_INFOPROV_READ'. The problem we are facing is that our loads are failing with the error 'Unknown error in SQL Interface' and a parallel process error.
    In the DTP we have changed the number of parallel processes from 3 (the default) to 1, but the issue with the data loads persists.
    We had a similar flow developed the BW 3.5 way, where we used the function module 'RSDRI_INFOPROV_READ', and there our data loads run fine.
    We suspect a compatibility issue of this FM with BI 7.0 data flows, but are not sure. If anybody has any relevant inputs on this, or has used this FM in a BI 7.0 flow, please let me know.
    Thanks in advance.
    Kind Regards
    Swapnil

    Hello Swapnil.
    Please check note 979660, which mentions this issue.
    Thanks,
    Walter Oliveira.
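    For context, the usual call pattern for 'RSDRI_INFOPROV_READ' looks roughly like this sketch (the InfoProvider name, package size and result structure are placeholders; lt_sfc, lt_sfk and lt_range must be filled with the characteristics, key figures and restrictions you actually need):

      TYPE-POOLS: rs.

      TYPES: BEGIN OF ty_data,                    "must match the fields requested in lt_sfc/lt_sfk
               material TYPE char18,              "example characteristic
               quantity TYPE p LENGTH 17 DECIMALS 3, "example key figure
             END OF ty_data.
      DATA: lt_data  TYPE STANDARD TABLE OF ty_data,
            lt_sfc   TYPE rsdri_th_sfc,           "characteristics to return
            lt_sfk   TYPE rsdri_th_sfk,           "key figures to return
            lt_range TYPE rsdri_t_range,          "selection restrictions
            lv_first TYPE rs_bool VALUE rs_c_true,
            lv_end   TYPE rs_bool.

      DO.
        CALL FUNCTION 'RSDRI_INFOPROV_READ'
          EXPORTING
            i_infoprov    = 'ZMYCUBE'             "placeholder InfoProvider
            i_th_sfc      = lt_sfc
            i_th_sfk      = lt_sfk
            i_t_range     = lt_range
            i_packagesize = 50000
          IMPORTING
            e_t_data      = lt_data
            e_end_of_data = lv_end
          CHANGING
            c_first_call  = lv_first
          EXCEPTIONS
            illegal_input = 1
            OTHERS        = 8.
        IF sy-subrc <> 0.
          EXIT.                                   "inspect sy-subrc / ST22 on failure
        ENDIF.
        " process the current package in lt_data here
        IF lv_end = rs_c_true.
          EXIT.
        ENDIF.
      ENDDO.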

  • Trying to add Contact information through Oracle Data Loader

    Hi,
    I have checked the Oracle On Demand guide PDF and am able to insert valid account data into Oracle On Demand via the client batch program. Can you point me to a valid contact map and contact*.csv file which can insert contact information into Oracle On Demand? If I could get dealer, vehicle or any other record types, that would also help. Where can I check the map details for all these record types? That is the biggest problem I am facing.
    Thanks in advance for your help !!!
    JD.

    I am able to insert a basic contact into Oracle On Demand through Data Loader, but the job completes only partially, with errors. First Name and Last Name are inserted, but columns like Title and Address are not. Can you tell me why it is behaving weirdly? The map looks okay to me.
    Appreciate your reply...
    Thanks...

  • How to Debug the Failed Data loads?

    Hi..All
    Can anyone please explain how to debug the different types of failed data loads?
    Thanks & Regards
    Jonn

    Hi Jonn,
    If any data load fails, first analyze where the failure occurs and decide what to do based on that.
    Analysis:
    1. Check the Status tab in RSMO: here we get the error information.
    2. Check the Details tab in RSMO: here we see whether the extraction completed and where exactly it failed, i.e. while updating from the PSA, in the update rules, etc.
    3. Check the job log of the corresponding request in SM37.
    4. Check whether there is any short dump in ST22.
    5. Check the request status in the target; sometimes the activation job fails in SM37 but in the target it shows as activated.
    6. Check transaction BD87 for any IDoc errors.
    7. Check SM58 for tRFCs.
    Hope this helps.
    Mohan

  • ODI - How to clear a slice before executing the data load interface

    Hi everyone,
    I am using ODI 10.1.3.6 to load data daily into an ASO cube (version 11.1.2.1). Before loading data for a particular date, I want the region defined by "that date" to be cleared in the ASO cube.
    I suppose I need to run a PRE_LOAD_MAXL_SCRIPT that clears the area defined by an MDX expression, but I don't know how I can automatically define the region by looking at several columns in the data source.
    Thanks a lot.

    Hi, thank you for the response.
    I know how to clear a region in an ASO database. I wrote a MaxL statement like the following:
    alter database App.Db clear data in region '{([DAY].[Day_01],[MONTH].[Month_01],[YEAR].[2011])}' physical;
    I have 3 separate dimensions: DAY, MONTH and YEAR. My question was: I don't know how to automate the clearing process before each data load for a particular date.
    Can I somehow automatically set the Day, Month, Year information in the MDX expression by looking at the day, month, year columns in the relational data source? For example, if I am loading data for 03.01.2011, I want my MDX expression to become {([DAY].[Day_01],[MONTH].[Month_03],[YEAR].[2011])}. In the data source table I also have separate columns for Day, Month and Year, which should make it easier, I guess.
    I also thought of using substitution variables to define the region, but then again the variables need to be set according to the day, month, year columns in the data source table. I should also mention that the data source table is truncated and loaded daily, so there can't be more than one day or one month etc. in the table.
    I don't know if I have stated my problem clearly; please let me know if there are any confusing bits.
    Thanks a lot.
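    One way to make this dynamic along the substitution-variable line (a sketch only; the variable names are assumptions, and the values would be set per run by ODI or a wrapper script from the day/month/year columns of the source table):

      /* set per run, e.g. for 03.01.2011 (use 'add variable' the first time) */
      alter database App.Db set variable CurDay 'Day_01';
      alter database App.Db set variable CurMonth 'Month_03';
      alter database App.Db set variable CurYear '2011';

      /* clear only the slice for the date being loaded */
      alter database App.Db clear data in region
        '{([DAY].[&CurDay],[MONTH].[&CurMonth],[YEAR].[&CurYear])}' physical;

    If substitution variables are not accepted in the region expression on your version, generating the full MaxL statement text from the wrapper script achieves the same thing.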

  • Spread the Data Loads in a SAP BW System

    Gurus,
    I want to spread out the data loads in our BW system. As a Basis person, how do I identify whether jobs are full loads or delta loads? Our goal is to make the load on the system evenly distributed, as we see too many data loads starting and running around the same time. Can you suggest the right approach to achieve our goal?
    Thanks in advance
    Siva

    Hello Siva,
    As already mentioned, the solution is to include the different steps of the data flow (extraction, ODS activation, rollup, etc.) in process chains and schedule these chains to run at different times, so that they do not place too much load on the system.
    If the problem is specific to data loads in the extraction step, then I guess you may be seeing the resource problem on the source system side. If you don't have load distribution switched on in the RFC connection to the source system, it is possible to specify that the source system extraction jobs are executed on a particular application server; please see the information in the 'Solution' part of note 147104 and read it carefully.
    Best Regards,
    Des

  • MaxL script to execute the data load in the background

    Hi,
    I have a problem with a MaxL script: I don't know the command to execute a data load in the background. Does someone know it? I would be very grateful for your help, because right now I have to load the data manually and then tick "execute in background".
    Thanks for your help
    Regards,

    If the two processes are in no way dependent on each other, why not just use two separate MaxL scripts and run / schedule them separately?
    If you really need to launch multiple MaxL operations against different cubes in the background from a single script, you can only do this with a shell / command script, not 'natively' in MaxL. If you're on Windows and using CMD, for example, see the 'START' command.
    EDIT: Crossed over with Sunil; I think he pretty much covers it!
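    For illustration, a minimal Windows CMD sketch of that approach (the script names are placeholders; essmsh is the MaxL shell):

      rem launch two MaxL data loads in parallel background windows
      start "load_cube1" essmsh load_cube1.msh
      start "load_cube2" essmsh load_cube2.msh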
