Inventory Loading: complex situation

hi experts,
I am handling an Inventory Management scenario:
1. I have already loaded 2LIS_03_BX into BW.
2. For 2LIS_03_BF, since the system is under heavy load, I split the load into two parts: 01.01.2006 to 01.07.2006 and 02.07.2006 to 01.04.2007. I have also already initialised the data (OLI1BW) for 01.01.2006 to 01.07.2006 in the R/3 source system.
3. Now I want to load the data into BW. How should I proceed? Should I go with a full update or a delta init?
What about the marker update? Should I tick or untick the marker update checkbox?
4. Without deleting the setup tables via LBWG, I am unable to load the data for 02.07.2006 to 01.04.2007. So what is the significance of LBWG?
5. What are the next steps?
Can anybody help me solve this complex situation?
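A quick aside on the split init in point 2 above: before running the setup per date range, it is worth sanity-checking that the two ranges neither overlap nor leave a gap. A throwaway sketch (dates taken from the post):

```python
# Sketch: check that split init-delta date ranges are non-overlapping and
# leave no gap. The dates are the ones from the post above.
from datetime import date, timedelta

ranges = [(date(2006, 1, 1), date(2006, 7, 1)),
          (date(2006, 7, 2), date(2007, 4, 1))]

def contiguous(ranges):
    """True if every range is valid and each starts the day after the previous ends."""
    if not all(start <= end for start, end in ranges):
        return False
    return all(s2 == e1 + timedelta(days=1)
               for (_, e1), (s2, _) in zip(ranges, ranges[1:]))

# contiguous(ranges) is True for the split used in the post
```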

hi,
Thanks for the kind and informative reply.
Based on your answers:
1. It is better to go for a delta init for the period 01.01.2006 to 01.07.2006, and after that every load I do should be a delta load. Is that true?
When I load the data as delta, do I need to untick the marker update? Is that right?
2. On the source system side, RSA7 will fill up with more than 100,000 records. Is that going to cause performance problems?
Can you please reply.

Similar Messages

  • Inventory load at WM level

    Hi All,
    We have an issue with an inventory load at my present client; here is the background.
    The client has two legacy systems:
    The first system holds information on where product is located in the warehouse (bin or rack locations) along with quantities, and
    the second system holds information on the bill of entry number (assigned by "customs" when product is imported from foreign-country vendors, which is 100% of cases) along with quantities.
    It is possible to have 2 bill of entry numbers for 100 pcs of a material in system #2 while the material sits in 4 different locations in system #1, or vice versa. We need to capture the bill of entry number in SAP, which we plan to capture in batches.
    We are struggling to devise a plan to load inventory at plant/WM level (mainly at WM level).
    Our approach (which is still not clear):
    1. Load stock at IM level for n entries (where n equals the number of bills of entry available for each material), which can be determined from the system #2 extract. Example: system #2 says 5 bills of entry are available for 1000 pcs of XYZ (100 + 300 + 200 + 200 + 200); then load inventory using MB1C with 5 lines carrying those quantities. External batches will be used for the load and will be updated later with the correct batch characteristic, BILL OF ENTRY.
    2. Now, how do we move the stock at WM level? I know transfer requirements will be created, but I have two spreadsheets: one with location information along with quantities, and the other with bill of entry information along with quantities. Again, as mentioned before, the number of bills of entry will in most cases differ from the number of locations the material sits in. EXAMPLE: the 5 bills of entry for 1000 pcs from above can be in 10 different locations: 50 pcs in location #1, 150 in location #2, and so on.
    We don't have a direct link between the bill of entry number and the location, which makes this load more difficult.
    We are thinking of having a program written, or a VB macro, to do a complex lookup across multiple tables to determine which bill of entry should be assigned to which location. But I would appreciate it if anyone can assist me or suggest a simpler approach.
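    Not the poster's actual program, but the lookup described above can be sketched as a simple greedy allocation that splits each bill of entry's quantity across the bin locations in order. All names and data shapes below are hypothetical, not the real legacy extracts:

```python
# Hypothetical sketch of the bill-of-entry-to-location assignment discussed
# above: split each bill of entry's quantity across bin locations greedily.
# Input shapes (lists of tuples) are assumptions, not the legacy extracts.

def allocate(bill_entries, locations):
    """bill_entries: [(boe_number, qty)], locations: [(bin, qty)].
    Returns [(boe_number, bin, qty)] load lines covering every piece once."""
    result = []
    loc_iter = iter(locations)
    cur_bin, cur_free = next(loc_iter)
    for boe, qty in bill_entries:
        while qty > 0:
            take = min(qty, cur_free)
            result.append((boe, cur_bin, take))
            qty -= take
            cur_free -= take
            if cur_free == 0:
                try:
                    cur_bin, cur_free = next(loc_iter)
                except StopIteration:
                    if qty > 0:
                        raise ValueError("locations short by %d pcs" % qty)
                    break
    return result

# The example from the post: 5 bills of entry, 1000 pcs, several locations.
lines = allocate([("BOE1", 100), ("BOE2", 300), ("BOE3", 200),
                  ("BOE4", 200), ("BOE5", 200)],
                 [("LOC1", 50), ("LOC2", 150), ("LOC3", 800)])
```

    Note this only produces an arbitrary pairing; whether that is acceptable for customs traceability is a business decision, and a physical count (as the reply suggests) may still be the only defensible source.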

    Hi Naren,
    "We don't have direct link between bill of entry number and location, which is making all this load more difficult."
    If you do not know the exact connection between "bills of entry" and locations (in the warehouse) in your legacy system(s), how do you expect to solve this problem in SAP?
    First, you should create one table with the correct information: material, batch, location. If you cannot determine it any other way, you should run a physical inventory (a physical stock count) and decide which material in which location has which "bill of entry" (~batch). If you cannot get this information, there is no point in discussing the problem, in my opinion; you need the proper input to get the proper output.
    After that you can easily upload your data, even directly into the WM storage bin without using any TR & TO (you can supply the WM data directly for IM movement 561; you can allow this in OMJJ). Or you can use a "traditional" stock upload via 561 in IM and then create TOs to put the stock away into the bins.
    Regards,
    Csaba

  • Inventory load using 561 movt type and auto-confirm TO in MB1C

    Hello All,
    I am doing an inventory load using movement type 561 in MB1C. The material then sits in storage type 998, bin "AUFNAHME". My requirement is to have the TO confirmed as part of the MB1C posting itself for movement type 561. Is there a way in SAP to trigger the TO confirmation automatically in the background when I use movement type 561 in MB1C?
    Right now my thinking is as follows:
    I create the inventory load using movement type 561 in MB1C, then use LT10 to move the stock to a different bin and confirm the TO.
    In the above, LT10 is an extra step used only to confirm the TO. I am trying to configure movement type 561 in such a way that the TO is confirmed automatically in the background while doing the MB1C, so that this step can be dropped.
    My second question: I have 5000 different materials to inventory-load using movement type 561 and MB1C, and I am trying to auto-confirm the TOs in the background during the MB1C itself. If that does not work, I can go to LL01, pick the unconfirmed TOs, and mass-confirm them.
    On the whole, my questions are: is there any standard configuration in SAP to auto-confirm the TO for movement type 561 in MB1C? If that is not possible, what is the best way to mass-confirm TOs in SAP?
    Thanks,
    Kapil

    Hello Kapil,
    Let me explain the process step by step.
    Assumption/mandatory criteria: this process is applicable only if you do not have storage unit management active. It is a warehouse transfer without a WM transaction, based on the same principle as MB1A with movement types 711/712.
    (1) Use OMJJ and create a new movement type by copying 561, for example "Z61". In the field selection (field selection from 201), double-click on WM and change "WM storage bin" and "storage type" from suppressed to required entry.
    (2) Now use MB1C with the above customised movement type. At the end of the transaction, the system will take you to a screen to enter the storage type and storage bin. Post the entry.
    That's it. Your stock will be in the warehouse and the IM stock will be in sync, with no TR/TO involved. You can create an LSMW to upload the data by this method. If you have a storage unit type (SUT) defined, you can follow the same process, but afterwards do not forget to update the LQUA-LETYP field with the respective SUT. However, you cannot use this method if you have SU management active.
    My view on this upload process: please consider other upload methods that are less time-consuming. In more than 90% of business scenarios a storage unit type is used, so you would have to perform the additional activity of updating the LQUA-LETYP (storage unit type) field. If that is the case, why not use another upload method?
    Thanks,
    Milind

  • MM & WM Inventory Load

    Hi All,
    To load MM & WM inventory I need to fill the BMSEG & RLBES structures, using movement type 561 for both Inventory Management and Warehouse Management. I could not find any solution to fill both these structures in one shot (batch input, BAPI, or IDoc).
    Please guide.
    Thanks
    Satish

    Hi,
    For the inventory load you can use business object BUS2017.

  • Inventory load failed... OPatch cannot load inventory for the given Oracle Home.OPatch failed with error code = 73

    Hi,
    I am going to apply a bundle patch; my Oracle database is 11.2.0.2 on the Microsoft Windows x86 platform. As per the readme file, Oracle recommends using OPatch utility release 11.2.0.1.3 or later. To check accessibility to the inventory I used the %ORACLE_HOME%/OPatch/opatch lsinventory command.
    It ended up giving the details below.
    Invoking OPatch 11.2.0.1.1
    OPatch could not create/open history file for writing : ***my oracle home path***\cfgtoollogs\opatch\opatch_history.txt
    **oracle_home***\cfgtoollogs\opatch\opatch_history.txt (Access is denied)
    Oracle Interim Patch Installer version 11.2.0.1.1
    Copyright (c) 2009, Oracle Corporation.  All rights reserved.
    OPatch could not open log file, logging will not be possible
    Oracle Home       : *****
    Central Inventory : C:\Program Files\Oracle\Inventory
       from           : n/a
    OPatch version    : 11.2.0.1.1
    OUI version       : 11.2.0.2.0
    OUI location      : **ORACLE_HOME***\oui
    Log file location : **ORACLE_HOME**\cfgtoollogs\opatch\opatch2015-04-09_16-27-49PM.log
    Patch history file: **ORACLE_HOME**\cfgtoollogs\opatch\opatch_history.txt
    Inventory load failed... OPatch cannot load inventory for the given Oracle Home.
    LsInventorySession failed: LsInventory cannot create the log directory **ORACLE_HOME***\cfgtoollogs\opatch\lsinv\lsinventory2015-04-09_16-27-49PM.txt
    OPatch failed with error code = 73
    Can anyone please help me solve this?
    Do I need to install the latest OPatch utility?

    Hi,
    The error message "LsInventorySession failed: LsInventory cannot create the log directory **ORACLE_HOME***\cfgtoollogs\opatch\lsinv\lsinventory2015-04-09_16-27-49PM.txt"
    indicates that you have a permission issue with the file system ORACLE_HOME/cfgtoollogs/opatch.
    Double-check the owner of that directory.
    Regards,

  • Initial Inventory Load - 561 mvmt type, FIFO Valuation

    Hello,
    We have a requirement wherein the client wants the initial inventory load (movement type 561) to be valuated at FIFO.
    My understanding is that an initial inventory load will always be valuated at standard price / moving average price.
    Can we have the initial inventory load valuated using FIFO? If so, can you please explain in detail.
    Note: not every material being loaded is batch-managed, and I am not sure whether batch management is a requirement for FIFO valuation.
    Thanks for your help.
    Regards,
    Bharat.

    FIFO is a year-end valuation.
    You can load the initial inventory with 561 and use the external value field to enter whatever value you want; SAP then updates the MAP in the material master accordingly.
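    To illustrate the reply above with made-up numbers: the externally supplied value flows into the total stock value, and the moving average price is simply total value divided by total quantity. A minimal sketch:

```python
# Made-up numbers: how an externally valuated 561 feeds the moving
# average price (MAP) = total stock value / total stock quantity.

def map_after_receipt(stock_qty, stock_value, recv_qty, recv_value):
    """Return (new_qty, new_value, new_map) after a valuated receipt."""
    new_qty = stock_qty + recv_qty
    new_value = stock_value + recv_value
    return new_qty, new_value, new_value / new_qty

# Empty stock, then 100 pcs loaded via 561 with external value 550.00:
qty, value, map_price = map_after_receipt(0, 0.0, 100, 550.00)
# map_price becomes 5.50, the new MAP in the material master
```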

  • [svn:cairngorm3:] 21174: Landmark does not work under complex situations

    Revision: 21174
    Author:   [email protected]
    Date:     2011-04-29 11:21:00 -0700 (Fri, 29 Apr 2011)
    Log Message:
    Landmark does not work under complex situations
    https://bugs.adobe.com/jira/browse/CGM-39
    Ticket Links:
        http://bugs.adobe.com/jira/browse/CGM-39
    Modified Paths:
        cairngorm3/trunk/libraries/Navigation/src/com/adobe/cairngorm/navigation/core/EnterAndExitInvoker.as
        cairngorm3/trunk/libraries/Navigation/src/com/adobe/cairngorm/navigation/waypoint/WaypointHandler.as

    Hi John,
    1) I like that the new model adds parameterization. It's cleaner than pulling in parameters from pre-set variables. However, the given example didn't actually make much use of it. The only non-constant parameter multiply-used in the example is the "table" variable. Seems like a lot of work for not a lot of gain, at least in this case?
    2) I am cautious that this new template/model condenses the paradigm sooo much, that it is no longer clear where XPath is involved vs straight constant tag names. Yes, Adobe's example is overly-expanded but that's common in code meant to be a demonstration.
    3) I also am cautious that the example intermingles direct node creation into the XPath search/processing chain. I've learned to be VERY careful with this. It only can work when the changes made do not interfere with the rule processing. In my model, I simply avoid it completely (by not making node-position or node-add/remove/move changes until after tree parsing is complete.) This will always be a safe model.
    Bottom line: while I very much appreciate the parameterization and lintability (is that a word? Sure makes sense to me )... I think I would still define each rule separately rather than bring them all together as an inline array in the rule processing call. To me it seems sooo condensed that the XPath meaning can become lost. (Would someone recognize that //para/section-head is actually an XPath statement that could (in another situation) be //para/* or //para/following-siblings::* ... while some of the other strings are exact-match tag names?)
    I realize this is all a matter of style... my preference: clarity for the future reader, particularly when the future reader is me a year later who forgot what all those parameters and embedded methods were all about ...
    Blessings,
    Pete

  • Inventory Load: Zprogram

    Dear All,
    I have a concern about the inventory load program we use to convert inventory from our legacy system into SAP.
    I know that the Z program we use calls the standard transaction MB11. My question: can I upload the inventory even without storage location data in the material master? I tried using MB11 for movement type 561 and it worked. But in our legacy system we have other movement types as well. How do I know for which movement types it will accept a stock load without storage location data?
    Thanks in advance,
    Ganesh

    Hi,
    If your program calls a standard transaction such as MB11 or MB1C, then you can directly load stock without creating the storage location data view.
    This option is in customizing. In SPRO, follow the path Materials Management > Inventory Management > Goods Receipt > Create automatic storage location data. There you will find the movement types for which you can directly load inventory without creating the storage location view; the system creates the storage location data upon the first goods receipt. Similarly, under Materials Management > Inventory Management > Transfer posting > Create automatic storage location data, you will find the movements for which the system automatically creates storage location data.
    Thanks,
    Ganesh Tiruvail

  • LSMW for IM Inventory Loads

    Hi all,
    I am trying to upload all the inventory load data through LSMW. I want to know whether there is any BAPI or direct/batch input program we can use in LSMW to upload the inventory data. I found BAPI object type BUS2028, but I am missing some fields to map. Is there any other program or BAPI?

    Try using the BAPI 'BAPI_MATPHYSINV_CREATE'.
    There is also a direct input program, 'RM07MMBL'.
    I am not sure about your mapping fields, so look into these objects and see whether they are helpful.
    Vinodh Balakrishnan

  • Cutover inventory load and Std. price release

    Dear consultants,
    We are about to go live next week and are trying to assess the impact of the following cutover steps:
    1. Load inventory quantities (movement type 561) on June 30th.
    2. Release the cost run on July 1st.
    These two steps will be applied to only part of the inventory, as the rest will be ready for loading into SAP a couple of days later; but since we want to create some shipments on July 1st, we would like to release the price of those parts ahead of the complete, all-plants cost estimate release.
    I am worried about two things:
    a. Since the entire plant is marked for July 3rd, I will have to re-run the cost estimate for that limited group of materials for July 1st and release it on July 1st, and I hope this will not cause errors.
    b. The inventory is loaded on June 30th and the release of the cost run happens on July 1st; will that cause any problem?
    I would appreciate responses from people who have run similar scenarios or have any insight into this situation.
    Thanks,
    Yoel.

    Hi,
    If you upload inventory and then release the standard cost:
    1. It will hit an inventory revaluation account.
    2. It may happen that your inventory is uploaded at zero value.
    Will that be OK with you?
    Regards
    Ajay M

  • Loading complex report data into a direct update DSO using APD

    Dear All,
    Recently, I had a requirement to download report data into a direct update DSO using an APD. I could do this easily when the report was simple, i.e. it had few rows and columns, but I faced problems when the report was complex. Summing up, I would like to know how to handle each of the following cases:
    1. How should I decide the key fields and data fields of the direct update DSO? Is it that the elements in ROWS go to the key fields of the DSO and the rest to the data fields? Correct me.
    2. What if the report contains restricted key figures and calculated key figures? Do I have to create separate InfoObjects in the BI system and then include these in the DSO data fields to accommodate the extracted data?
    3. How do I handle the free characteristics and filters?
    4. Moreover, I observed that if the report contains selection screen variables, then I need to create variants in the report and use a variant in the APD. So, if I have 10 sets of users executing the same report with different selection conditions, do I need to create 10 different variants and pass them into 10 different APDs, all created for the same report?
    I would appreciate it if someone could answer my questions clearly.
    Regards,
    D. Srinivas Rao

    Hi,
    Please find the answers below.
    1. Key fields and data fields of the direct update DSO:
    --- Yes, you can use the elements in ROWS as the key fields, but if two records arrive with the same values in the ROWS elements, the data load will fail. So you basically need a combination of values that is different for each record.
    2. Restricted and calculated key figures:
    --- Yes, you would need to create new InfoObjects for the CKFs and RKFs in the report and include them in your DSO.
    3. Free characteristics and filters:
    --- The default filters work in the same way as when you execute the report yourself, but you cannot use the free characteristics in the APD; only the ROWS and COLUMNS elements in the default layout can be used.
    4. Selection screen variables and variants:
    --- Yes, you would need to create 10 different APDs. They are very simple to create (you can copy an APD), but it will certainly be a maintenance issue: you would have to maintain 10 APDs.
    Please revert in case of any further queries.
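    As a side note on answer 1, the uniqueness requirement on the key fields can be checked on the extracted rows before the load. A minimal sketch; the row layout and field names are invented:

```python
# Sketch: find duplicate key-field combinations that would make a direct
# update DSO load fail. Row layout and field names are invented.
from collections import Counter

def duplicate_keys(rows, key_fields):
    """Return key tuples occurring more than once in rows (a list of dicts)."""
    counts = Counter(tuple(row[f] for f in key_fields) for row in rows)
    return [key for key, n in counts.items() if n > 1]

rows = [
    {"MATERIAL": "A", "PLANT": "1000", "QTY": 10},
    {"MATERIAL": "A", "PLANT": "1000", "QTY": 20},  # same key: load would fail
    {"MATERIAL": "B", "PLANT": "1000", "QTY": 5},
]
dups = duplicate_keys(rows, ["MATERIAL", "PLANT"])  # [("A", "1000")]
```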

  • URGENT : Please help: Purchasing and Inventory loads

    I am currently in BWQ. I did the initial deletion of the setup tables with LBWG and then ran the setup jobs through SBIW for Purchasing and Inventory.
    I also made the industry setting and the process key setting.
    I ran the init load yesterday and it brought some records, and then the deltas brought 0 records. I felt that there should not be 0 records, so I deleted the setup tables again and re-ran the setup jobs. I can see records in the setup tables and in RSA3.
    But when I run the load today it brings 0 records. I thought maybe the delta had already run this morning, which is why it brought 0 records, so I ran a full repair request, but there are still only 0 records.
    Can someone explain this and advise what needs to be done?
    Thanks

    Hi BI TCS,
    Initially, when you loaded the init data you got some records and the delta brought 0 records. This is logical if no new records were created in the R/3 system, or if you had not executed the delta collector job (job control through LBWE for the particular application).
    Now carry out the following steps for the purchasing data loads:
    1. Delete the setup tables using transaction LBWG.
    2. Perform the setup for purchasing in background mode.
    3. Check the job log for step 2 in SM37.
    4. Find the number of documents written to the setup tables for purchasing using transaction NPRT.
    5. Run the init load to BW; this loads the data that you have covered in the setup tables.
    6. Create/change some data in R/3. This data must be within the init range used in your init InfoPackage; if you did the init load with no selection parameters, then there is no problem.
    7. Run the job control for purchasing in LBWE.
    8. Run the delta load; it should load the correct delta data into BW.
    Hope this resolves it.
    Regards
    Pradip

  • Handling Unit Initial Inventory load help

    Hi Experts,
    I need to upload handling unit inventory using RHUWM_INITIAL_LOAD. Could someone who has done this before help me with more information? How can I load a data file from my laptop? What should the data file format be?
    Any help on this would be greatly appreciated.
    Thank you
    with regards,
    Muthu Ganapathy

    Hi Jurgen,
    Thank you for your time and for the reply. I am not a developer. Could you give me more instructions on how to specify my local file to test RHUWM_INITIAL_LOAD? I appreciate any input.
    Thank you.
    with regards,
    Muthu Ganapathy.

  • Inventory loads to cube 0IC_C03

    Hi Guys,
    I am loading data to the inventory cube 0IC_C03 and have done all the steps. I have one question: initially I loaded data to the cube from the 2LIS_03_BF DataSource. I did the following steps:
    • Init delta (2LIS_03_BF) with data transfer.
    • Released the request with "No Marker Update" ticked.
    From tomorrow we will get delta loads. My question is: do we need to compress the delta requests with "No Marker Update" ticked?
    Or do I need to untick the "No Marker Update" option?
    The data is very critical here. Please let me know if you have the right answer.
    Regards
    Santosh

    Hi,
    Please see the notes below.
    Marker update is used to reduce the time needed to fetch the non-cumulative key figures during reporting; it makes it easy to obtain the values of previous stock quantities in reports. The marker is a point in time which marks an opening stock balance, and data up to the marker is compressed.
    The "No Marker Update" concept arises when the target InfoCube contains a non-cumulative key figure, for example the material movements InfoCube 0IC_C03, where stock quantity is non-cumulative. Loading data into the cube involves two steps:
    1) First, load the records for the opening stock balance, i.e. the stock present at the time of implementation. At this point, let the marker be updated (uncheck "No Marker Update") so that the current stock quantity is stored in the marker. After that, when loading the historical movements (stock movements made before the cube was implemented), check "No Marker Update" so that the marker is not updated: those historical movements are already reflected in the opening stock balance we loaded, so aggregating them into the marker would double-count the present stock.
    2) After every successful delta load, leave "No Marker Update" unchecked (allow the marker to be updated) so that the changes in stock quantity are reflected in the marker value. The marker is only updated for records that are compressed, not for uncompressed requests; hence every delta load request should be compressed.
    Check or uncheck the marker option:
    Compressing a request that should update the stock marker => uncheck the "No Marker Update" option.
    Compressing a load that should not update the stock marker => check the "No Marker Update" option.
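    The two check/uncheck rules above can be condensed into a small decision table. This is only a restatement of the rules in this reply; the load-type labels are chosen here for illustration, not SAP terminology:

```python
# Restating the rules above as a lookup table: for each kind of request,
# should "No Marker Update" be ticked when compressing? The load-type
# labels are illustrative, not SAP terminology.
NO_MARKER_UPDATE = {
    "opening_stock_BX": False,  # let the marker capture the current stock
    "historical_BF_UM": True,   # already in the opening stock: skip marker
    "delta": False,             # marker must follow every stock change
}

def tick_no_marker_update(load_type):
    return NO_MARKER_UPDATE[load_type]
```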
    Relevant FAQs:
    1) The marker isn't relevant when no data is transferred (e.g. during a delta init without data transfer).
    2) The marker update acts like a checkpoint (it gives a snapshot of the stock on the particular date when it is updated).
    Reference information:
    Note 643687 Compressing non-cumulative InfoCubes (BW-BCT-MM-BW)
    Note 834829 Compression of BW InfoCubes without update of markers (BW-BEX-OT-DBIF-CON)
    Note 745788 Non-cumulative mgmnt in BW: Verifying and correcting data (BW-BCT-MM-BW)
    Note 586163 Composite Note on SAP R/3 Inventory Management in SAP BW (BW-BCT-MM-IM)
    Thanks and regards

  • Inventory Load

    Hi
    I am loading data for inventory. I have 3 DataSources: BX, BF and UM.
    My question is: once we complete the setup table filling and the load into BI, should we delete the contents of application area '03' each time before filling the setup tables for the next DataSource, e.g. for BF after filling BX?
    Kindly advise.

    These three DataSources are related to inventory: BX is for the opening stock, BF is for material movements and UM is for revaluations. Before loading this data into BW you need to maintain some settings on the R/3 side.
    According to SAP Note 315880:
    1. T-code BF11: activate the NDI flag, which is reflected in table TBE11 for the entry 'NDI'.
    2. Set the application indicator 'BW' to active using transaction BF11, which sets the tick mark (X) in field AKTIV for APPLK = 'BW' in table TBE11.
    3. Activate the process key under "Maintain Industry Sector".
    4. Generate the stock initialization for 2LIS_03_BX.
    5. Fill the setup tables for "Inventory Controlling".
    6. Schedule the InfoPackage for 2LIS_03_BX with update mode "Generate Initial Status".
    Then load BF and UM.
    There is a PDF available on how to load inventory; just search for it on SDN.
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/0ea8a990-0201-0010-2796-fa9ae8691203
