Historical Data Update

Hi,
We need to migrate historical data such as purchase orders for our client, who wants to load data from 2008 to date. My question to you all is: what should the MM period be, i.e. should the MM period in OMSY be kept at 01/2008, or can I keep it as the current period?
Please suggest.

Dear Challenges,
I am assuming that go-live has already been done.
For example: if the open posting periods are November and December, and say go-live was done on 30th November, then you can upload PRs, POs, or any other purchasing documents with any date, but stock must be uploaded with a date on or before 30th November. GRs can then be posted on any date within the open posting period range.
Another example: if go-live was done somewhere around August and the open posting periods are November and December, then as in the case above all purchasing documents can be uploaded with any date, but GRs must be posted only in the posting periods that are open. Reverting posting periods has serious implications; kindly refer to the relevant OSS notes for that.
If you have any queries, kindly revert with full details of your current situation.
Regards
Lakshminath Gunda

Similar Messages

  • Update GL accounts (historical data) with new Business Area values

    Hi experts,
    Situation:
    One company code with multiple plants already productive w/o Business Area.
    Business Area would be implemented soon to internally differentiate the company code.
    What is needed:
    To update the GL account balances with the proper Business Area values, as required, for instance starting in a new year. Something like executing a balance carry forward with the ability to add the business area.
    Purpose:
    For internal analysis and reporting, to be able to compare data without having to deal with historical data that has a blank Business Area field.
    Question:
    Does anybody know if a tool or program is available to do this?
    Your responses are greatly appreciated.
    Thanks,
    GG

    Hi,
    this message cannot be changed (OBA5), and a new analysis period for the cost center does not help if any plan/actual data has already been posted in this fiscal year.
    If it is not just test data (which could be deleted so that the change can be made), the only way is to add the business area in the cost center master by maintaining the CSKS table directly, if you are allowed to do so...
    If there are balance sheet items (e.g. assets assigned to such a cost center; the balance sheet postings from asset accounting are posted with the business area from the cost center master), you have to repost them in GL accounting, otherwise the business-area-wise balance sheet will be wrong.
    Best regards, Christian
    Edited by: Christian Ortner on Mar 23, 2010 12:15 PM

  • Init Load,Historical Data Load & Delta Load with 'No Marker Update'

    What is the difference between compression of the Inventory cube for the init load, the historical data load, and the delta load with 'No Marker Update'?
    Please help with this query.
    Thanks
    Gaurav

    Hi Gaurav,
    I believe you will find the answers here...
    [http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/e0b8dfe6-fe1c-2a10-e8bd-c7acc921f366&overridelayout=true]
    regards
    Pavel

  • Remote historical data is not retrieved completely viewing it in MAX4

    Hi,
    since I installed LabVIEW 8 I have had some problems retrieving historical data from another computer. Sometimes not all data is retrieved (if I zoom in or out or move back in time), and this missing data is never retrieved.
    I already deleted the Citadel cache once, but after that even less data was retrieved... What's really weird is that for channels which weren't retrieved correctly, the data is not updated anymore!
    On the remote computer I have LabVIEW DSC Run-Time 7.1 running (MAX 3.1.1.3003); on my local computer, MAX 4.0.0.3010 and LabVIEW 8 DSC (development system) are installed in parallel with LV DSC 7.1.1 (dev system). LV 8 is installed for testing purposes (I doubt we'll switch soon), and overall I like MAX 4. The HyperTrend.dll on my local computer is version 3.2.1017.
    This is really a quite annoying bug!
    So long,
        Carsten
    Message Edited by cs42 on 02-02-2006 09:18 AM

    Hi,
    > We've been unable to reproduce this issue. If you could provide some additional information, it might help us out.
    I did fear this, as even on my computer it is happening just sometimes...
    > 1) How many traces are you viewing?
    The views I observed this in had 2 to 13 traces.
    > 2) How often are the traces being updated?
    For some it's pretty often (about once a second); for some it's very infrequent (no change in data, meaning they are updated only because of the max time between logs). I more often see this for traces that are updated very infrequently, but I think I've seen it for frequent traces as well (for those it currently works).
    > 3) Are the traces being updated by a tag value change, or by the "maximum time between logs" setting in the engine?
    It happened for both types.
    > 4) What is the frequency of the "maximum time between logs" setting?
    Max time between logs is 10 minutes.
    > 5) Is the Hypertrend running in live mode when you zoom out/pan?
    I think it happened in both modes, but it definitely did in live mode.
    > 6) If you disable/re-enable live mode in the Hypertrend, does the data re-appear?
    I couldn't trigger the loading of the data. All I did is wait and work with MAX (zooming, panning, looking at data) and after quite a while (some hours), the data appeared.
    Just tested this on a view where data is missing (for some days now!), and it didn't trigger data reloading. Zooming and panning don't as well. There's a gap of up to 3 days now for some traces. 7 of the 13 traces of this view are incompletely shown. All stopping at the same time but reappearing at different ones.
    AFAIR from the laboratory computer (these are temperatures and it's very plausible that they didn't change), there wasn't any change in these traces, so they all got logged because of the max time...
    I just created a new view and added these traces: the gap is there as well.
    (Sorry to put all this in this one entry even though it relates to your other questions, but I started this live test with disabling/re-enabling live mode.)
    > 7) Are the clocks on the client and server computers synchronized? If not synchronized, how far apart are the times on the two computers?
    They should be (Windows 2000 Domain synchronized to ADS), but are 5 seconds apart.
    One thing I remember now: I have DIAdem 10 beta 2 installed (10.0.0b2530, USI + DataFinder 1.3.0.2526). There I had (and reported) some problems with loading data from a remote machine's Citadel database as well. This was attributed to a cache problem. Maybe a component is interfering?
    Thanks for investigating.
    Cheers,
        Carsten

  • How to populate historical data in the appraisal

    Hello,
    I am working in Oracle PMS module. Our requirement is to display historical data of an employee in the next appraisal cycle.
    E.g. if there are 3 appraisal cycles in a year: in the first appraisal the manager will add objectives and competencies for an employee, let's say Competency 1, Competency 2, Objective 1 and Objective 2.
    After completion of the first cycle, when the manager initiates the second appraisal, the two values in the objectives block (Objective 1 and Objective 2) should be auto-populated, and likewise the two values in the competency block (Competency 1 and Competency 2).
    Here the manager can add more objectives or competencies.
    Please suggest how this can be achieved.
    Thanks in advance,
    sheetal

    Hi Shunhui
    Answers :-
    If the required fields are not available in the Maintenance link, does it mean that I have to goto SE11 and create an append structure for the extract structure MC02M_0ITM which belongs to the datasource 2LIS_02_ITM?
    Then I have to write codes in the user exit for transactional datasources in RSAP0001?
    Ans: That's correct. Once you have done this on the R/3 side, replicate the datasource on the BW side. Activate the transfer rules with appropriate mapping for the 3 new fields, and also adjust the update rules so that these 3 new fields are available in the InfoProviders.
    Are you able to provide some information on the errors that might arise during transports with the delta being active in the production system? Will data be corrupted? Will I need to clear the delta queue before the transport goes over?
    Ans: I assume that deltas are active in the production system. There is a risk to your active delta into BW.
    Before your R/3 transports reach the production system, you need to clear the delta queue, i.e. bring the number of LUWs for 2LIS_02_ITM to zero. Also stop the delta collector job in R/3 for application 02 (purchasing). Make sure that there are no postings (transactions/users are locked) so that there are no changes in the purchasing base tables.
    Then send your R/3 transport into the production system (append structure, new fields + user exit code) and replicate on the BW side. Now send the BW transports to the BW production system.
    I did this earlier for 2LIS_12_VCITM and had a step-by-step procedure written up... I will have to search for that document. Let me know your email ID and I will try to send it once I find it.
    Regards
    Pradip
    (When you edit the question, there are option buttons with which you can assign points; for that you need to log on to SDN.)

  • Reloading Historical Data into Cube: 0IC_C03...

    Dear All,
    I'm working on SAP BW 7.3. I have five years of historical data loaded into InfoCube 0IC_C03 (Material Stocks/Movements (as of 3.0B)), and the deltas are also scheduled (DataSources: 2LIS_03_BF & 2LIS_03_UM) for the InfoCube. I have a new business requirement to reload the entire historical data into the InfoCube with the addition of "Fiscal Quarter", and I do not have a DSO in the data flow. I know it's a bit of a tricky InfoCube (Inventory Management) to work with, and reloading the entire historical data can be challenging.
    1. How to approach this task, what steps should I take?
    2. Drop/delete the entire data from the InfoCube, then delete the setup tables, refill them, and run a "Repair Full Load"; is this the way?
    3. What key points should I keep in mind, as different BO reports are already running and using this InfoCube?
    4. Should I run "Repair Full Load" requests for all DataSources (2LIS_03_BX, 2LIS_03_BF & 2LIS_03_UM)?
    I will appreciate any input.
    Many thanks!
    Tariq Ashraf

    Hi Tariq,
    Unfortunately, you will need some downtime to execute a stock initialization in a live production system. Otherwise you cannot guarantee data integrity. There are, however, certainly ways to minimize it. Please see SAP Note 753654 - How can downtime be reduced for setup table update. You can dramatically reduce the downtime by distinguishing between closed and open periods. The closed periods can be updated retrospectively.
    To make it more concrete, you could consider e.g. the periods prior to 2014 as "closed". You can then do the initialization starting from January 1, 2014. This will save you a lot of downtime. Closed periods can be uploaded retrospectively and do not require downtime.
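The closed/open split suggested above can be sketched as a small helper for planning the cut-over (a hypothetical planning aid, not SAP code; the function name, period list, and cutoff are all illustrative):

```python
from datetime import date

def split_periods(periods, cutoff):
    """Partition (year, month) periods into 'closed' periods, which can be
    filled retrospectively without downtime, and 'open' periods, whose
    setup-table fill requires the posting-free downtime window."""
    closed = [p for p in periods if date(p[0], p[1], 1) < cutoff]
    still_open = [p for p in periods if date(p[0], p[1], 1) >= cutoff]
    return closed, still_open

periods = [(2013, 11), (2013, 12), (2014, 1), (2014, 2)]
closed, still_open = split_periods(periods, date(2014, 1, 1))
# closed = [(2013, 11), (2013, 12)]; still_open = [(2014, 1), (2014, 2)]
```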
    Re. the marker update, please have a look at the document Re-initialization of the Material Stocks/Movements cube (0IC_C03) with 2LIS_03_BF, 2LIS_03_BX and 2LIS_03_UM in BW 7.x; steps 10, 11, 12 and 13 contain important information.
    Re. the steps, they look OK to me, but please double-check with the document Re-initialization of the Material Stocks/Movements cube (0IC_C03) with 2LIS_03_BF, 2LIS_03_BX and 2LIS_03_UM in BW 7.x.
    Please have a look at the following document for more background information around inventory management scenarios:
    How to Handle Inventory Management Scenarios in BW (NW2004)
    Last but not least, you might want to have a look at the following SAP Notes:
    SAP Note 436393 - Performance improvement for filling the setup tables;
    SAP Note 602260 - Procedure for reconstructing data for BW.
    Best regards,
    Sander

  • Non-cumulative initialization is not possible with historical data

    Hi all,
    While loading the inventory cube 0IC_C03, the 2LIS_03_BX load request completes successfully, but while compressing that request with "No Marker Update" I get the following error message:
    <b>Non-cumulative initialization is not possible with historical data</b>
    Regards
    siva.

    Are you sure you didn't use BF instead of BX? This message indicates that you have historical data in your cube, whereas BX only loads a fixed state at a specific point in time... Or maybe you initialized BX twice in R/3 without deleting the previous init; I don't know if that could cause this error, but it's another possibility.

  • Master data update bringing in multiple records for single entry

    Hello,
    I found that if I deleted and reloaded data in a master data object, it showed the correct number of records. However, a plain full update did not overwrite it with the correct number of data records; there were always 1 or 2 extra blank records. I thought a master data update should overwrite previous records, so that we need not delete and reload.
    Why would this happen?
    Thanks!

    Hi there,
    Are you referring to time-dependent master data? Are from/to values present?
    If so, that's the correct behaviour, since BW (OLAP), unlike R/3 (OLTP), allows you to maintain historical data, i.e. a material can be valid with certain attributes for a specific time period, while other attribute values can be valid for the same material in another time period.
    Diogo.

  • How to recreate EBS user and keep all his historical data.

    Hi all
    We have a user that is having an issue seeing any of his scheduled Discoverer reports within the Schedule Manager window of Discoverer Plus; Discoverer Desktop works fine.
    The solution for it is to recreate the EBS user. The problem with this is that, if we recreate the EBS user, he will lose all historical data connected to that user, including the results of the scheduled Discoverer reports as well as all of the EBS created/last updated information.
    Is there a way to recreate an EBS user and preserve the historical references?
    Thanks

    We have a user that is having an issue seeing any of his scheduled Discoverer reports within the Schedule Manager window of Discoverer Plus; Discoverer Desktop works fine.
    The solution for it is to recreate the EBS user. The problem with this is that, if we recreate the EBS user, he will lose all historical data connected to that user, including the results of the scheduled Discoverer reports as well as all of the EBS created/last updated information.
    Why do you need to recreate the user? Are you saying you are going to create a new username for the same user and end-date the old one?
    Is there a way to recreate an EBS user and preserve the historical references?
    I believe there is no such way to find all records/tables with the old user_id. Even if you find the list and update them manually, I believe this approach is not supported.
    Please log a SR to confirm the same with Oracle support.
    Thanks,
    Hussein

  • AR Historical Data Upload

    Gurus,
    A real quick question:
    A new implementation, ECC 6.0; we are struggling with the historical data upload for AR. How? To upload history and link it to the customer, I need to upload it through the customer and hence the GL account (reconciliation account), and the GL account needs to be configured as a non-reconciliation account. But when I create the customer and try to link it with a non-recon account, it does not let me save the master data.
    Please advise on the AR historical data upload urgently. Thank you.
    Achhar.

    Why would you need a non-recon account assigned in the customer?
    The normal posting would be, depending on your items:
    DR Customer *
    CR Load Account (whatever the name is)
    The customer would update the assigned recon account.
    Regards
    Hein

  • Historical data in SIS Reports

    Hi,
    We implemented SIS in our company last week and have successfully been able to fetch data from the date we actually went live. However, the users would also be interested in seeing data for the past couple of years. Can somebody please let me know whether there is any way to get the historical data into my MCSI report?
    Waiting for your reply at the earliest
    Thanks & Regards,
    Gaurav

    Dear Gaurav,
    Yes, it is very much possible to get your historical data.
    All you need to do is go to transactions OLI7 for updating orders, OLI8 for deliveries, and OLI9 for billing documents.
    The system updates all the data in a different version, "&(".
    After checking the correctness of the data in this version, you can copy it to the main data version (version 000) via transaction OLIX.
    You can also read more on this at the following path:
    SPRO > Logistics Information System (LIS) > Logistics Data Warehouse > Data Basis - Tools > Setup of Statistical Data
    Read the help given for these applications.
    Regards
    Suresh S
    Award points if it helps.

  • Upload PR/PO Historical data

    Hi
    How can I upload SAP PR/PO historical data from one SAP system to another?
    I have all the relevant data table downloads from the old system. Now I need to upload this PR/PO historical data into the new SAP system.
    There is no link between the old SAP system and the new SAP system; they are owned by different organizations/owners.
    Immediate replies will be appreciated.
    Thanks & Regards
    Raju

    Hi Raju,
    Loading POs with transactions is problematic, since the follow-on documents can't be loaded. If a PO is completely closed and you load it into the new system, you should block it; otherwise SAP will treat it as an active order.
    If you load a PO with GR > IR, the GRs can't be loaded, which means the invoice posted against this PO will remain unbalanced. You can of course close the PO using MR11, but I think it is much easier just to skip this PO and register the incoming invoice directly in FI.
    If you load a PO with IR > GR, the IRs can't be loaded, which means the incoming GR will leave the PO unbalanced. So here too I would suggest skipping the order and registering the incoming GR without reference to the PO (movement type 501).
    In all my projects we only migrated completely "naked" POs, and besides, we tried to minimise the number of such orders: let the buyers purchase more and complete the orders before the cut-over date; they can do it. In that case the number of open orders will normally be so low that it won't justify automatic migration. On the other hand, entering the orders in the new system manually is good training for the buyers.
    If the reason for loading the closed POs is having purchasing statistics in the new system, this can be done very easily by updating the corresponding LIS tables directly (S011, S012). Another tip is to load the data both into version 000 (actual data) and into some other version in these tables, so that you can always copy them back to 000 whenever you need to reload LIS for some reason.
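    The triage above can be sketched as a simple decision rule (an illustrative sketch only; the function name, quantities, and return strings are invented, and real cut-over logic would read the PO history from the legacy extracts):

```python
# Hypothetical sketch of the legacy-PO triage described above.
# Not a real SAP API; quantities would come from the old system's extracts.

def triage_po(gr_qty: float, ir_qty: float, ordered_qty: float) -> str:
    """Decide how to handle a legacy PO during cut-over migration."""
    if gr_qty == ir_qty == ordered_qty:
        # Fully received and invoiced: migrate for reference, then block it
        return "migrate and block"
    if gr_qty > ir_qty:
        # GRs cannot be loaded: skip the PO, post the incoming invoice in FI
        return "skip; post invoice directly in FI"
    if ir_qty > gr_qty:
        # Invoices exceed receipts: skip, post the incoming GR via mvt 501
        return "skip; post GR without PO reference (mvt 501)"
    # No follow-on documents yet: a "naked" PO, safe to migrate as open
    return "migrate as open PO"
```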
    BR
    Raf
    Edited by: Rafael Zaragatzky on Jun 3, 2011 8:28 PM

  • Inventory 0IC_C03, issue with historical data (data before Stock initializa

    Hi Experts,
    For our Inventory Management implementation we proceeded as follows.
    The initialization data and the delta records match the ECC MB5B data correctly, but the historical data (2007 and 2008, up to the initialization date) does not match the stock on the ECC side; it shows only the difference quantity between the stock initialization and the query date.
    We have done all the initial settings in BF11, maintained the process keys, and filled the setup tables for the BX and BF datasources; we are not using the UM datasource.
    1. We loaded the BX data and compressed the request (without the tick mark at "No Marker Update").
    2. We initialized the BF data and compressed the request (with the tick mark at "No Marker Update").
    3. For deltas we compress the requests daily (without the tick mark at "No Marker Update").
    Is this the correct process?
    Also, as you mentioned for BX, is there no need to compress (should the BX request not be compressed?),
    and do we need to compress the delta requests?
    We have an issue with historical data validation. Here is an example:
    We initialized on May 5th, 2009, and loaded BX data from 2007 onwards (historical data).
    When we look at the data for January 1st, 2007, the BI side shows the value with a negative sign, while ECC shows a different value.
    For example, ECC stock on January 1st, 2007: 1500 KG.
    Stock at initialization on May 5th, 2009: 2200 KG.
    The BI side shows -700 KG.
    2200 + (-700) = 1500, but the BI side does not show 1500 KG (it shows values in negative, relative to the initialization stock).
    Can you please tell whether this process is correct, or whether we went wrong in the data loading?
    In the validity table (L table) there are 2 records with SID values 0 and -1; is this correct?
    Thanks in advance.
    Regards,
    Daya Sagar
    Edited by: Daya Sagar on May 18, 2009 2:49 PM
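The marker arithmetic in the example above can be illustrated with a small sketch (purely illustrative; the quantities are the ones from the post, the movement dates are invented for the example, and the function is hypothetical, not SAP code):

```python
# Minimal illustration of non-cumulative (marker) stock reconstruction.
# The marker holds the stock at initialization; the historical stock at an
# earlier date is the marker minus all movements posted between that date
# and the initialization date.

def stock_at(day, marker_stock, init_day, movements):
    """movements: list of (ISO date, signed qty) pairs (+receipt, -issue)."""
    later = sum(qty for d, qty in movements if day < d <= init_day)
    return marker_stock - later

# Illustrative movements totalling +700 KG between 2007 and initialization
movements = [("2008-06-01", 400), ("2009-01-15", 300)]
stock_2007 = stock_at("2007-01-01", 2200, "2009-05-05", movements)
# 2200 - 700 = 1500 KG, matching the ECC figure in the example
```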

    Hi Anil,
    Thanks for your reply.
    1. You have performed the initialization on 15th May 2009.
    yes
    2. For the data after the stock initialization, I believe that you have either performed a full load from BF data source for the data 16th May 2009 onwards or you have not loaded any data after 15th May 2009.
    For BF, the delta data after stock initialization was compressed with the "No Marker Update" option unchecked.
    If this is the case, then I think you need to
    1. Load the data on 15th May (from BF data source) separately.
    Do you mean that the BF (material movements) data of 15th May should be compressed with the "No Marker Update" option unchecked, as we do for the BX datasource?
    2. Compress it with the No Marker Update option unchecked.
    3. Check the report for data on 1st Jan 2007 after this. If this is correct, then all the history data will also be correct.
    After this you can perform a full load till date
    Here, does "till date" mean that 15th May is not included?
    for the data after stock initialization and then start the delta process. The data after the stock initialization(after 15th May 2009) should also be correct.
    can you please clarify these doubts?
    Thanks
    Edited by: Daya Sagar on May 20, 2009 10:20 AM

  • Best design for historical data

    I'm searching for a way to design some historical data.
    Here is my case:
    I have millions of rows defining a population, spread across 20-30 different tables.
    Those rows are related to one master subject area; let's say it is a person.
    The detail tables define statuses over a period of time (marital status, address, gender (yes, that may change in some cases!), program, phase, status, support, healthcare, etc.). They all have two attributes that define the time period (dateFrom, dateTo) and one or more attributes that define the status of the person (the measure).
    We need a weekly situation for this population from 1998 to date.
    The problems are these:
    1) the population we will analyze will involve some 20 different criteria (measures); we may divide those into several datamarts to avoid too much complexity
    2) we will drill down from year to week, and will need comparisons like year to year, year to previous year, year to date, etc. at each level of the time hierarchy (year, quarter, month, week).
    The questions are:
    1) do we need to transform our data to a week level for each person to determine the status at that level (the data may be updated, which would cause this transformation to be refreshed)?
    2) do we need to aggregate at each level, because mixed situations exist due to the fact that more than one status may exist for the same person in a given time period (more than one status may exist in March, for example; how do we interpret it?), which will require some logic to be applied?
    We will be glad to hear some recommendations/ideas about this!
    Thank You

    I would try to get some exact user requirements, with the dataset you have described there are millions of combinations of answers that can be defined and it will be difficult for you to fulfill them all in one place, especially if there are slowly changing dimensions involved.
    The user requirements will determine what levels you transform the data at.
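A minimal sketch of the week-level transformation discussed in question 1 above (all names, dates, and statuses are illustrative; real logic would also have to resolve overlapping statuses, the ambiguity raised in question 2):

```python
from datetime import date, timedelta

def weekly_snapshots(rows, start, end):
    """Expand (date_from, date_to, status) rows into one row per week.

    Snapshots are taken on each Monday between start and end. If several
    statuses overlap a snapshot week, all of them are emitted, which is
    exactly the aggregation ambiguity the question raises.
    """
    # Align to the first Monday on or after `start`
    monday = start + timedelta(days=(7 - start.weekday()) % 7)
    out = []
    while monday <= end:
        for d_from, d_to, status in rows:
            if d_from <= monday <= d_to:
                out.append((monday, status))
        monday += timedelta(weeks=1)
    return out

rows = [(date(2024, 1, 1), date(2024, 1, 14), "single"),
        (date(2024, 1, 15), date(2024, 12, 31), "married")]
snaps = weekly_snapshots(rows, date(2024, 1, 1), date(2024, 1, 31))
# Five Mondays in January 2024: two "single" rows, then three "married"
```

Because the expansion is deterministic, it can be refreshed whenever the underlying period rows change, which addresses the update concern in question 1.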

  • Historical data transfer in HRMS

    Hi,
    I need the best possible ways of transferring historical data to HRMS. I have looked at ADE and Web ADI; do any of you know how to use them?
    Regards,
    Shahzad

    Migrating historical data within Oracle HRMS is not the easiest task in the world. I have experimented with Web ADI because it makes use of the API calls.
    In my experience, I would suggest using Web ADI only for small amounts of data in non-complex scenarios. If you are considering migrating a full set of historical data, I would suggest a more technical approach:
    a combination of SQL*Loader to stage the data, followed by PL/SQL procedures that call the APIs to insert/update the data, etc.
    As cdunaway mentions, it may really depend on whether your HRMS environment is live. The PL/SQL procedure route gives you much more flexibility, as you can apply logic to the data load. Due to DateTrack this can get messy... Correction / Update / Insert / Insert Override! Ouch.
