Inventory - Historical load from BF to 0IC_C03 changes current stock qty

Loaded 0IC_C03 from 2LIS_03_BX to create the opening balance.
Compressed with marker update (the "No Marker Update" checkbox unchecked). Ran a report to display the opening stock (0TOTALSTOCK).
The report matches MMBE in the source system perfectly.
Next, I loaded historical movements from 2LIS_03_BF and compressed without marker update (the "No Marker Update" checkbox checked).
Ran the stock overview report again: the stock balances have now changed. Historical movements
should not change current stock positions.
Steps leading up to this issue (ECC/BI 7.0):
1. Loaded data from the BX setup table to the PSA and then to the cube using a DTP with the "Initial Non-cumulative for Non-cumulative" option. Compressed with marker update.
2. Ran a delta init for BF without data transfer to create my delta pointer, then loaded the BF setup table data to the PSA using full repair requests. Created one request per year, based on the posting-date year.
3. Then pushed the data from the PSA to the cube using a delta DTP. Compressed all historical requests WITHOUT marker update (the "No Marker Update" checkbox ticked).
Any suggestions? I have some sort of a workaround for now: the BX records in the cube do not contain values for the Material Document Number (MBLNR), so I can put a characteristic filter in the query to display only records that have a null value for Material Document Number. With that filter the report seems to be correct, as I am eliminating the records from BF.
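For reference, this is how I understand the non-cumulative logic is supposed to behave; a minimal sketch in Python with made-up numbers, purely illustrative:

```python
from datetime import date

# 2LIS_03_BX opening balance, compressed WITH marker update:
# the reference point ("marker") now holds the current stock.
reference_point = 100  # 0TOTALSTOCK as of the stock initialization

# 2LIS_03_BF historical movements (already contained in the BX snapshot).
historical_movements = [
    (date(2006, 3, 1), +40),   # goods receipt
    (date(2007, 6, 1), -25),   # goods issue
]

def current_stock(marker, hist_moves, marker_updated_for_history):
    """Stock the query shows for 'today'.

    Historical requests compressed WITHOUT marker update should only
    serve to roll stock back to past dates; they must not touch the
    reference point, because BX already contains their effect.
    """
    if marker_updated_for_history:
        # Wrong: history counted twice (once in BX, once via the marker).
        return marker + sum(qty for _, qty in hist_moves)
    return marker

def stock_on(key_date, marker, hist_moves):
    """Stock for a historical key date: roll the marker back in time."""
    return marker - sum(qty for d, qty in hist_moves if d > key_date)

print(current_stock(reference_point, historical_movements, False))  # 100 (correct)
print(current_stock(reference_point, historical_movements, True))   # 115 (inflated)
print(stock_on(date(2006, 12, 31), reference_point, historical_movements))  # 125
```

The second print shows the double counting that a marker update on the historical requests would cause, which is exactly what I expected my checkbox settings above to avoid.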
Regards,
Ash

Hi,
Are you trying to install something whose corresponding InfoArea, InfoProviders, etc. do not exist? This might be the reason for those errors.
Also, you should use "Collect objects automatically".
Regards,
Rahul S

Similar Messages

  • Inventory data load from Inventory Cube to another Copy Cube

    Hello Experts,
    I am trying to load inventory data from the inventory cube (say YCINV) to a copy cube, say YCOPY_CINV (a copy of the inventory cube), but the results appear inconsistent when I compare the reports on these two cubes. I am trying to populate a field in the copy cube so that I can load the same data, with the new field, back to the original cube. I am doing this reload back and forth for historical data purposes only.
    I have seen a lot of posts on how to run the setups for inventory data, but my case does not require setup runs.
    Does note 1426533 solve the issue of loading from one cube to another? We are on SAP BI 7.01 with SP 06, but the note specifies SP 07.
    I have tried note 375098 to see if it works, but I do not see the options mentioned from the step "Using DTP (BW 7.x)" onwards when I try to follow that note in BI.
    Please advise on whether to go with implementing note 1426533, or whether there is any other way to load inventory data from one cube to the other.
    Regards,
    JB

    Hi Luis,
    Thanks for your reply,
    I do not see any setting like "initial stock" in the DTP, only "initial non-cumulative for non-cumulative". I did try the "initial non-cumulative for non-cumulative" option, but the results still do not match the inventory cube data. I also do not see the marker checkbox in the copy cube (under the roll-up tab). Please let me know whether we can really implement this solution, i.e. copying from the inventory cube to the copy cube and then reloading it back to the inventory cube for historical data. Currently I am comparing the queries on these two cubes; if the data matches I can go ahead and implement it in Production, otherwise it would not be wise to do so.
    Regards,
    JB

  • Changes in stock value of the material even after closing period

    The client closes the accounts every month end. Say they close the books of accounts for the month of Jan ’07 on 31st Jan 2007. After closing the accounts they run MC.9 to check the stock value of the different plants for the previous month, i.e. for the month of Jan ’07. But every day when the client runs MC.9 they find differences in the stock value and quantity.
    For better understanding, consider that they close the period on 31st Jan. When they run MC.9 on 2nd Feb it shows a value of, say, 100$ and a quantity of 100. When we run the same report on 4th Feb the value has changed to, say, 98$ while the quantity remains the same. Again, when we run it on 6th Feb the value has changed again, to say 102$, and the stock is 100.
    We checked MB51 by posting date and entry date. We could not find any back-posting.
    What could be the reason for the variation in the stock value and quantity? Any pointers will be highly appreciated.

    Hi Venkat,
    Your previous posting talked about variation in stock value, and now you are saying that there are changes in stock quantity as well. So the better option would be to check the materials that show variation and compare them with other standard reports. Export the list of materials to an Excel sheet and compare it with the other reports. This will help you find the materials that have changed, so you can narrow down to the root cause. Once you have the materials, you can go by posting date and find the movements.
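    A rough sketch of that comparison, assuming the two MC.9 runs have been exported to CSV files (the file and column names below are only placeholders for whatever the export produces):

```python
import csv

def load_report(path):
    """Read an exported stock list into {material: (qty, value)}.
    The column names 'Material', 'Qty', 'Value' are assumptions about the export layout."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row["Material"]: (float(row["Qty"]), float(row["Value"]))
                for row in csv.DictReader(f)}

run_feb02 = load_report("mc9_2007-02-02.csv")  # hypothetical file names
run_feb04 = load_report("mc9_2007-02-04.csv")

# Materials whose quantity or value changed between the two runs
for material in sorted(set(run_feb02) & set(run_feb04)):
    if run_feb02[material] != run_feb04[material]:
        print(material, run_feb02[material], "->", run_feb04[material])
```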
    Hope this helps. Reward if you find it useful.
    regards
    Anand.C

  • How to Load Historical values from Material Prices

    Hi
    I have to load historical values for material prices.
    I'm using extractors 0CO_PC_ACT_02 and 0CO_PC_ACT_05.
    Do you know how I can do this in an easy way?
    Help is appreciated.
    thanks in advance.
    best regards

    Hi Joao,
    The extractor 0CO_PC_ACT_05 takes its data from table MBEW (data for the current period, the previous period, and the last period of the previous year).
    If the material ledger is active, it takes the information from table CKMLHD for those materials.
    It does not appear to fetch from MBEWH.
    Please validate my comments above with a sample extraction for one material and plant.
    If that is the case, I would suggest you create a custom extractor based on MBEWH to get the historical price data; with 0CO_PC_ACT_05 you can then get the prices for the previous periods as they are completed.
    0CO_PC_ACT_02 gets you the inventory value, not the price, in the same way.
    You might also have to consider 0CO_PC_ACT_06 and 0CO_PC_ACT_07 if you have project stock and sales order stock in your business scenario.
    Please let me know, how this goes.
    Thanks,
    Krishnan.

  • Historical data from ACE20 load balancer modules

    Hello,
    I am trying to get some historical data from our ACE20 load balancer modules, which are housed in a 6504, but cannot seem to find where to get this information. I would like to get information such as memory used, CPU, active connections, throughput, HTTP requests, etc. over the past month. Is there any way I can do this? I am using ANM 2.0.
    Thanks

    Hello,
    Below are the steps to load from DB Connect.
    Go to RSA1 -> Modeling -> Source Systems and double-click on the source system. On the right-hand side, right-click on the topmost node. This takes you to a screen where you can enter the table name (if you know it) or just execute; it will display all the tables you have in Oracle. Double-click on the selected table, select the required fields, and generate the DataSource. Then go back to RSA1 -> Source Systems and replicate the DataSource. Now assign an InfoSource as usual, create an InfoPackage, and start the load.
    Note that this does not support a delta process.
    This document could help:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/2f0fea94-0501-0010-829c-d6b5c2ae5e40
    Regards,
    Dhanya

  • Historical Data with it's delta loading from PSA to ODS in 3.x model

    Hello ,
    I want to load one year of historical data, along with its deltas, into an ODS object in the BW 3.x model.
    Can you please give the steps to load historical data from the PSA table into the ODS object (not from the source system)?
    Thanks a lot
    Regards
    BI Beginner

    Hi
    Run a full-load InfoPackage from the PSA to the ODS with selections on 0CALDAY (give one year as the selection).
    Make this full load a repair full request: in display mode of the InfoPackage, in the menu bar click Scheduler --> Repair Full Request, check the option, and click OK.
    Now execute the InfoPackage.
    If you run it like this, the init you have between the PSA and the ODS will not get disturbed.
    Regards,
    Venkatesh.

  • The latest version of ITunes has totally changed how I load music or photos on my IPod and IPhone.  It used to be I could highlight a song list  or cd from my library and drag it to the i pod icon and it would load.  Now, it's changed and I can't fig

    I have a MacBook Pro running OS X 10.7.5, and my iTunes version is 12.01.26. Here's the problem: with the older versions of iTunes I could connect a device, then highlight and drag whatever music I wanted to load from my library to the device. Now there's a different process which I cannot fathom. I don't want to sync my iPod every time I want to add music; I don't want to restore my iPhone to do the same, or to add photographs which were sent to me and then uploaded to my computer. Can anyone out there give me a step-by-step process for what used to be very simple procedures? Thanks in advance. Also, how do I contact someone at Apple who deals with consumer complaints?

    Hi Edward,
    You should still be able to manually manage the music on your iPhone within iTunes. When your iPhone is plugged in, look in the Summary tab to see whether "Manually manage music and videos" is enabled; you can also turn off "Automatically sync when this iPhone is connected". Take a look at the first article below for more information on that. As for your other question about calling someone, I have provided two options for you. If it is feedback on iTunes, you can use the second link to give detailed feedback on what is going on. If you want to actually talk to someone, the last link has a list of numbers to call for any support or service you may need.
    Manage content manually on your iPhone, iPad, and iPod
    http://support.apple.com/en-us/HT201593
    Product Feedback
    http://www.apple.com/feedback/
    Contact Apple for support and service
    http://support.apple.com/en-us/HT201232
    Take it easy,
    -Norm G. 

  • Footage changes once color project is loaded from disk

    Hi all,
    I have a big problem in Color 1.0.4.
    I sent a 20-minute edit from FCP to Color. All was fine. The edit is the first part of a marriage service, a shoot taken with two cameras.
    I originally put the sequence together using Multi-Cam, but then I replaced all the clips with the originals, as I had problems when I first went into Color with that.
    The problem is that one of the angles appears to have lost the information about where its IN/OUT points are. This happens once the Color project has been loaded from disk (all is fine until I shut Color and reload the project). The sections in the sequence where this camera is used always play the source file for that camera from the beginning (a shot from outside the church!). This is causing me all sorts of problems and I cannot seem to get past it. I need to save the work and then re-open it in FCP to see what the grade looks like, as I don't have an external monitor.
    Any help with this would be appreciated.
    The thumbnails show the correct footage from the camera. Adding a section to the render queue results in a render of 1 frame. Even though the IN and OUT points on the Color timeline show roughly 20 seconds, the render view shows an in and out point one frame apart...
    Thanks in advance,
    Richard
    Message was edited by: richyN

    What is/are your sequence codec/settings?
    I know I get this sort of behavior on my external monitor if I've got a Kona export settings mismatch, but not on the timeline.
    When you observe the application jumping back to the first frame on park, but finding the correct images on play, that sort of infers there is a broken reference somewhere in how FCP is recalling the media.
    Bad container? frame rate mismatch?
    But before you re-edit, can you save the effort to this point by "Baking out" a movie of the sequence?
    It would be at least interesting to know if QT finds the right media at your edit points. At that point, a fairly straightforward movie could be razored and sent to COLOR...
    jPo

  • Inventory Cube Loading - Questions....

    This is the process I intend to follow to load InfoCube 0IC_C03 and the questions therein. I have the "how to handle inventory scenarios" document, so please don't suggest I read that.
    1A) Delete Set-up tables for application 03
    1B) Lock Users
    1C) Flush LBWQ and delete entries in RSA7 related to application 03 (optional - only if needed)
    2A) Fill set up tables for 2LIS_03_BX. Do Full load into BW Inventory Cube with "generate initial status" in InfoPackage.
    2B) Compress request with a marker update
    3A) Fill set up table for 2LIS_03_BF with posting date 01/01/2006 to 12/31/2007 (Historical) and load to Inventory Cube with a full update
          QUESTION1: Does this need a marker update?
          QUESTION2: Is this a good strategy  - do movement documents that old get updated once posted?
          QUESTION3: Does the posting date restriction depend on how far back in history we want to go and look at stock values?
    3B) Fill set up table for 2LIS_03_BF with posting date 01/01/2008 to 9/9/9999 (Current) and load to Inventory Cube with a delta init with data transfer.
    3C) Compress load in 3B without a marker update
    4A) Fill set up table for 2LIS_03_UM  and load to Inventory Cube with a delta init
    4B) Compress load in 4A without a marker update
          QUESTION4: How should we select the posting date criteria? Do I need to load from 01/01/2006 to 9/9/9999 since that's the range used for BF?
    5) Start V3 update jobs via Job Control
    6) Initiate subsequent delta loads from BF and UM and compress with marker update
    QUESTION 5: Is the sequence of loading BX first, then BF and UM fixed? or can I do BF and then BX, UM
    QUESTION 6: Any tips on minimizing downtime in this particular scenario? Please don't suggest generic steps. If you can suggest something specific to this situation, that'd be great.
    I hope you can help with the 6 questions I asked above.
    Regards,
    Anita S.

    Hi Anita,
    Please find my answers below. I have worked enough with this scenario and hence feel that these would be worth considering for your scenario.
    3A) Fill set up table for 2LIS_03_BF with posting date 01/01/2006 to 12/31/2007 (Historical) and load to Inventory Cube with a full update
    QUESTION1: Does this need a marker update?
    In this step we don't need a marker update while compressing.
    QUESTION2: Is this a good strategy - do movement documents that old get updated once posted?
    I am not able to understand this question clearly.
    QUESTION3: Does the posting date restriction depend on how far back in history we want to go and look at stock values?
    Yes. We need to start from the latest and then go as far back as we want to see stock values. This holds true when we are using non-cumulative key figures.
    4B) Compress load in 4A without a marker update
    QUESTION4: How should we select the posting date criteria? Do I need to load from 01/01/2006 to 9/9/9999 since that's the range used for BF?
    There is no need to provide any selection criteria for UM while loading to BW, as this would fetch the same data filled in the setup of revaluations, unless you are looking for only some of the history. Otherwise you can fill the setup for the company code list and bring the whole data to BW in a single full load, as the data won't be that huge compared to BF.
    6) Initiate subsequent delta loads from BF and UM and compress with marker update
    QUESTION 5: Is the sequence of loading BX first, then BF and UM fixed? or can I do BF and then BX, UM
    This is fixed in terms of compression with marker updates.
    QUESTION 6: Any tips on minimizing downtime in this particular scenario? Please don't suggest generic steps. If you can suggest something specific to this situation, that'd be great.
    Yes. Most of the time-consuming activity is filling the BF history; it is negligible for BX and UM comparatively.
    Either try having multiple selective BF setup fills based on posting date, depending on the available background processes, or fill the setup for the open months during downtime and the rest of the history once you unlock the system for postings. The reason for this is that we don't expect any postings in the historical periods.
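    To keep the marker handling from the steps above in one place, a small checklist sketch (plain Python, purely illustrative, not BW configuration):

```python
# Compression settings discussed in this thread.
# True  = compress WITH marker update ("No Marker Update" left unchecked)
# False = compress WITHOUT marker update ("No Marker Update" ticked)
marker_update_on_compression = {
    "2LIS_03_BX opening stock":         True,   # sets the reference point
    "2LIS_03_BF historical full loads": False,  # already contained in the BX snapshot
    "2LIS_03_UM historical loads":      False,  # same reasoning as BF history
    "2LIS_03_BF/UM ongoing deltas":     True,   # move the reference point forward
}
```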
    Feel free to ask any further questions about this.
    Naveen.A

  • Full load from a DSO to a cube processes less records than available in DSO

    We have a scenario where every Sunday I have to make a full load from a DSO with on-hand stock information to a cube, where I register a counter at material and store level if there is stock available.
    The DTP has no filters at all and has a semantic group on 0MATERIAL and 0PLANT.
    The key in the DSO is:
    0MATERIAL
    0PLANT
    0STOCKTYPE
    0STOR_LOC
    0BOM
    of which only 0MATERIAL, 0PLANT and 0STOR_LOC are later used in the transformation.
    As we had a growing number of records, we decided to delete in the START routine all records where the inventory is not greater than zero, thus eliminating zero and negative inventory records.
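    (For clarity, the filter logic is roughly the following; a Python sketch of the idea only, since the real START routine is ABAP and the field names here are placeholders.)

```python
# Drop records whose stock quantity is not greater than zero,
# mirroring the START routine change described above (illustrative only).
def filter_positive_stock(source_package):
    return [rec for rec in source_package if rec.get("quantity", 0) > 0]

# Example
package = [
    {"material": "M1", "plant": "P1", "quantity": 12},
    {"material": "M2", "plant": "P1", "quantity": 0},
    {"material": "M3", "plant": "P2", "quantity": -5},
]
print(filter_positive_stock(package))  # keeps only M1
```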
    Now comes the funny part of the story:
    Prior to these changes I would [in a test system, just copied from PROD] read some 33 million records and write out the same number of records. Of course, after the change we expected to write out fewer. To my total surprise, I was now reading 45 million records with the same unchanged DTP, and writing out the expected smaller number of records.
    When checking the number of records in the DSO I found the 45 million, but I cannot explain why the earlier loads only retrieved some 33 million from the same unchanged set of records.
    When checking in PROD, same result: we have some 45 million records in the DSO, but when we do the full load from the DSO to the cube, the DTP only processes some 33 million.
    What am I missing? Is there some compression going on? Why would the number of records in a DSO differ from the number of records processed in the data packages when I am making a FULL load without any filter restrictions and only a semantic grouping in place for part of the DSO key?
    ANY idea, thought is appreciated.

    Thanks Gaurav.
    I did check whether there were more/any loads done in between; there were none in the test system. As I mentioned, it was a new copy from PROD to TEST, and I compared the number of entries in the DSO: it seems to match between TEST and PROD; OK, a few more in PROD, but they can be accounted for. In TEST I loaded the day before the changes were imported to have a comparison, and between that load and the one after the changes were imported nothing in the DSO was changed.
    Both DTPs, in TEST and PW2, load from the active table [without archive]. The DTPs have not been changed in quite a while, so I ruled that out. Same with activation of data in the DSO: this DSO gets loaded and activated in PROD daily via a process chain, and we load daily deltas into the cube in question. Only on Sundays, for the beginning of the new week/fiscal period, do we need to make a full load to capture all materials per site with inventory. The deltas loaded during the week are less than 1 million records, but the difference between the number of records in the DSO and the amount processed in the data packages is more than 10 million per full load, even in PROD.
    I really appreciate the knowledgeable answer; I just wish you had pointed out something that I missed.

  • Error while loading from PSA to Cube. RSM2-704 & RSAR -119

    Dear Gurus,
    I have to extract about 1.13 million records from the setup tables to the MM cube 0IC_C03. While doing so, the following two errors occurred when loading from the PSA to the cube:
    Error ID RSM2, error no. 704: Data records in the PSA were marked for data package 39. Here there were 2 errors. The system wrote information or warnings for the remaining data records.
    Error ID RSAR, no. 119: The update delivered the error code 4.
    (Data records for package 39 selected in PSA - 2 error(s). Record 5129: No SID found for value 'EMN ' of characteristic 0BASE_UOM. Record 5132: No SID found for value 'EMN ' of characteristic 0BASE_UOM)
    I tried to change the defective records in the PSA by deleting the erroneous field value EMN and tried to load again, but it failed.
    Now, my questions are:
    (i) How can I resolve the issue?
    (ii) How do I rectify the erroneous records? Should we only delete the erroneous field value, or delete the entire record from the PSA?
    (iii) How do I delete a record from the PSA?
    Thanks & regards,
    Sheeja.

    Hi,
    Data records for package 39 selected in PSA - 2 error(s). Record 5129 :No SID found for value 'EMN ' of characteristic 0BASE_UOM . Record 5132 :No SID found for value 'EMN ' of characteristic 0BASE_UOM
    The issue is with records 5129 and 5132.
    In the PSA, check the erroneous records; it will display only the error records. Just edit them as required and try reloading into the cube.
    Deleting a single record is not possible.
    Let us know if you still have any issues.
    Reg
    Pra

  • Tracking history while loading from flat file

    Dear Experts
    I have a scenario wherein I have to load data from a flat file at regular intervals, and I also want to track the changes made to the data.
    The data looks like the following; it keeps changing, and the key is P1, A1, C1.
    DAY 1 -- P1, A1, C1, 100, 120, 100
    DAY 2 -- P1, A1, C1, 125, 123, 190
    DAY 3 -- P1, A1, C1, 134, 111, 135
    DAY 4 -- P1, A1, C1, 888, 234, 129
    I am planning to load the data into an ODS and then into an InfoCube for reporting. What will the result be, and how do I track history, e.g. if I want to see both the data on Day 1 and the current data for P1, A1, C1?
    Just to mention: as I am loading from a flat file, I am not mapping RECORDMODE in the ODS from the flat file.
    Thanks and regards
    Neel
    Message was edited by:
            Neel Kamal

    Hi
    You don't mention your BI release level, so I will assume you are on the current release, SAP NetWeaver BI 2004s.
    Consider loading to a write-optimized DataStore object to store the data. That way, you automatically have a unique technical key for each record, and it will be kept for historical purposes. In addition, load to a standard DataStore object, which will track the changes in its change log. Then load the cube from the change log (this avoids your summarization concern), as the changes will arrive as updates (after images) in the standard DataStore object.
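    As an illustration of that change-log mechanism with the sample key P1/A1/C1 from your post, here is a rough sketch (plain Python with made-up structures, not the actual BW change log format):

```python
# Write-optimized DSO: every load is kept as-is under a technical key (here: day).
write_optimized = {
    1: (100, 120, 100),
    2: (125, 123, 190),
    3: (134, 111, 135),
    4: (888, 234, 129),
}

# Standard DSO change log: new image 'N', before images 'X' (negated old values),
# after images ' ' (new values) for the single key P1/A1/C1.
change_log = []
previous = None
for day, values in sorted(write_optimized.items()):
    if previous is not None:
        change_log.append({"day": day, "recordmode": "X",
                           "values": tuple(-v for v in previous)})
    change_log.append({"day": day,
                       "recordmode": "N" if previous is None else " ",
                       "values": values})
    previous = values

# A cube loaded additively from the change log ends up with the latest state:
cube_totals = (0, 0, 0)
for rec in change_log:
    cube_totals = tuple(t + v for t, v in zip(cube_totals, rec["values"]))
print(cube_totals)         # (888, 234, 129) - the Day 4 values
print(write_optimized[1])  # (100, 120, 100) - Day 1 still available for history
```

    The write-optimized object keeps every daily snapshot untouched for historical reporting, while the additive change log always nets out to the latest state in the cube.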
    Thanks for any points you choose to assign
    Best Regards -
    Ron Silberstein
    SAP

  • Inventory delta load problem

    Hi,
    We are facing an inventory data load issue with the error below. The main problem is that they have been getting this error at PSA level since 16th June; still, they changed the red request icon to green, loaded to the InfoCube, and compressed. When the process chain runs, they get the same error daily.
    Error:
    Caller 09 contains an error message
    Problem:
    -> Data flow PSA -> IC: for the 20th request (it looks like it contains data from the 16th to the 20th) they changed the red PSA request to green, loaded it to the cube, and compressed.
    For the 21st request (it looks like it contains data from the 16th to the 21st) they changed the red PSA request to green, loaded it to the cube, and compressed.
    Kindly guide me on what solution we can give.
    thanks

    I suggest that you handle this with utmost care. Since it does not look like you have a DSO in the data flow, recovering the data would be impossible if the cube data is corrupted.
    Need to understand the details to be able to give a proper solution.
    You can consider the request reverse posting option, which will post reverse images of the PSA requests in scope and nullify the values in the cube. Please try this in the Dev and Quality systems before trying it in Production.
    Alternatively, you can create a custom DataSource based on this PSA table and reverse the values in the transformation.
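    A minimal sketch of that reverse-image idea (illustrative Python only; the field names are placeholders, and in BW this would be done in the transformation or via the standard reverse posting function):

```python
# Negate the key figures of the corrupted request's records so that, loaded
# additively into the cube, they cancel out the original postings.
def reverse_image(record, key_figures=("quantity", "value")):
    reversed_rec = dict(record)
    for kf in key_figures:
        reversed_rec[kf] = -record[kf]
    return reversed_rec

corrupted = {"material": "M1", "plant": "P1", "quantity": 10, "value": 250.0}
print(reverse_image(corrupted))  # quantity -10, value -250.0
```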
    If you can identify the documents created after this delta corruption, you can set them up in the source and do a repair full upload.
    Please let me know whether it helps.

  • Data loading from ODS toODS

    Hi All,
    I have one source ODS and the target is also an ODS. Can you explain when we have to go for
    1. Full Repair
    2. Full Load
    3. Delta Load
    and also please give the loading procedure for all three of the above scenarios?
    Points will be awarded suitably for the answers.
    Thanks in advance gurus.
    Regards...
    BW World

    Hi,
    For loading from one ODS to another you use an export DataSource. It's called a data mart load.
    A full load is generally performed once, to load all the historical data.
    After that you do an init load, and deltas will follow, which load only the changed or new records on a regular basis.
    A full repair can be described as a full load with selections. The main advantage of a full repair load is that it won't affect the delta loads in the system. If you load a normal full request to a target with deltas running, you will have to initialize again for the deltas to continue; a full repair won't affect the deltas.
    This is normally done when we lose some data or there is a data mismatch between the source system and BW.
    Check OSS Note 739863, 'Repairing data in BW', for all the details.
    You need to create an export DataSource (8<Source ODS>) and the update rules from the source ODS to the target ODS for loading.
    Also create InfoPackages for the init, full/full repair, and delta loads.
    The loading procedure has been discussed previously in SDN.
    You will get detailed information in SDN.
    Thanks,
    JituK

  • Delta records are not loading from DSO to info cube

    My query is about delta loading from DSO to info cube. (Filter used in selection)
    Delta records are not loading from DSO to Info cube. I have tried all options available in DTP but no luck.
    Selected "Change log" and "Get one request only" and run the DTP, but 0 records got updated in info cube
    Selected "Change log" and "Get all new data request by request", but again 0 records got updated
    Selected "Change log" and "Only get the delta once", in that case all delta records loaded to info cube as it was in DSO and  gave error message "Lock Table Overflow" .
    When I run full load using same filter, data is loading from DSO to info cube.
    Can anyone please help me on this to get delta records from DSO to info cube?
    Thanks,
    Shamma

    Data is loading in the case of a full load with the same filter, so I don't think the filter is the issue.
    When I follow the sequence below, I get the lock table overflow error:
    1. Full load from the active table, with or without archive.
    2. Then, with the same settings, if I run an init, the final status remains yellow, and when I change the status to green manually it gives the lock table overflow error.
    When I change the settings of the DTP to an init run:
    1. Select change log and get only one request, and run the init; it completes successfully with green status.
    2. But when I run the same DTP for delta records, it does not load any data.
    Please help me to resolve this issue.
