Correct historical data in InfoCube

Hi,
We have an InfoCube which stores data on the basis of Reporting Division (EUROPE) and subsidiaries (Germany, France, Great Britain, etc.).
Now I have a situation where I need to add a new Reporting Division (GB) to this cube, so that users can see reports with Reporting Division (Europe) and its subsidiaries, and Reporting Division (GB) with its subsidiaries.
Can anyone please help me with how I can correct the historical data, and the data that will flow in from now on, so as to make this possible in this cube?
Help will be really appreciated with maximum points.

Hi,
First take a backup of the cube content.
Then choose the cube, right-click, and select Additional Functions > Remodeling.
It then prompts for a remodeling rule name; enter the name and execute, then select whether you have to add/delete/modify a characteristic/key figure.
If you add a characteristic/key figure, specify how/from where the historical data is to come.
Then execute the rule.
http://help.sap.com/saphelp_nw04s/helpdata/en/a4/1be541f321c717e10000000a155106/frameset.htm
Ramesh
Edited by: ramesh kumar on Mar 18, 2008 6:05 AM

Similar Messages

  • Copying historical data into planning area

    Hi,
    "For performance reasons, SAP recommends that you copy historical data from the InfoCube to a time series instead of reading it directly from the InfoCube."
    Please correct me if I am wrong: the time series in the above context is the planning area.
    I created a generic export DataSource from MSDP_ADMIN, giving it the name "9aplan", and it should generate an InfoSource in RSA1 > InfoSources > Unassigned Nodes as 9aplan.
    From here on, how can I load the history from the InfoCube (let's say, sales) to the InfoSource?
    Should I create a transactional (real-time) InfoCube and then load data to it from both SALES and 9aplan?
    Then how can the system access this cube when planning? Where is it specified?
    Thank you.

    Hi Vishu,
    A planning area is the structure: you need to initialize the "time series" against a planning version, which will allow the data in the key figures to be stored.
    Loading data from the InfoCube to the planning area time series does not require any BW structure. Just use transaction /SAPAPO/TSCUBE to copy the sales history data from the cube to the required key figure in the planning area (for the given version, most often being 000).
    Hope this answers your question.
    Thanks,
    Somnath

  • Reloading Historical Data into Cube: 0IC_C03...

    Dear All,
    I'm working on SAP BW 7.3. I have five years of historical data loaded into InfoCube 0IC_C03 (Material Stocks/Movements (as of 3.0B)), and the deltas are also scheduled (DataSources 2LIS_03_BF & 2LIS_03_UM) for the InfoCube. I have a new business requirement to reload the entire historical data into the InfoCube with the addition of "Fiscal Quarter", and I do not have a DSO in the data flow. I know this is a tricky InfoCube (Inventory Management) to work on, and reloading the entire historical data can be challenging.
    1. How should I approach this task, and what steps should I take?
    2. Drop/delete the entire data from the InfoCube, then delete the setup tables, refill them and run a "Repair Full Load": is this the way?
    3. What key points should I keep in mind, as different BO reports are already running and using this InfoCube?
    4. Should I run "Repair Full Load" requests for all DataSources (2LIS_03_BX, 2LIS_03_BF & 2LIS_03_UM)?
    I will appreciate any input.
    Many thanks!
    Tariq Ashraf

    Hi Tariq,
    Unfortunately, you will need some downtime to execute a stock initialization in a live production system; otherwise you cannot guarantee data integrity. There are, however, certainly ways to minimize it. Please see SAP Note 753654 (How can downtime be reduced for setup table update). You can dramatically reduce the downtime by distinguishing between closed and open periods: the closed periods can be updated retrospectively.
    To make it more concrete, you could consider, e.g., the periods prior to 2014 as "closed". You can then do the initialization starting from January 1, 2014. This will save you a lot of downtime. Closed periods can be uploaded retrospectively and do not require downtime.
    Re. the marker update, please have a look at the document Re-initialization of the Material Stocks/Movements cube (0IC_C03) with 2LIS_03_BF, 2LIS_03_BX and 2LIS_03_UM in BW 7.x; steps 10, 11, 12 and 13 contain important information.
    Re. the steps, they look OK to me, but please double-check with the document Re-initialization of the Material Stocks/Movements cube (0IC_C03) with 2LIS_03_BF, 2LIS_03_BX and 2LIS_03_UM in BW 7.x.
    Please have a look at the following document for more background information around inventory management scenarios:
    How to Handle Inventory Management Scenarios in BW (NW2004)
    Last but not least, you might want to have a look at the following SAP Notes:
    SAP Note 436393 - Performance improvement for filling the setup tables;
    SAP Note 602260 - Procedure for reconstructing data for BW.
    Best regards,
    Sander

  • MP31 - no historical data - how to generate?

    Hi specialists,
    My problem: I have several materials for which I cannot execute a forecast because there is "no historical data". The material is very old and there was for sure consumption on it in the past, but this material was never under MRP before.
    Is there a way to generate this historical data automatically, maybe even in mass? Or do I really have to do it manually, material by material?
    Any help is appreciated.
    Tnx a lot
    Kurt

    Hi Kurt,
    Historical consumption data gathering in the system does not depend on whether the material is under MRP, but on the definitions per movement type and on the maintenance of the period in one of the plant-level views of the material master.
    I don't know of any procedure in standard SAP that will "rebuild" consumption data. Nevertheless, you can always enter your "corrected" historical data, and it will be used by the system for forecasting.
    You can do it manually for each material, or upload it from an external file.

  • How to fill a new single field in an InfoCube with historical data

    Hello Everybody,
    We have an SD InfoCube with historical data since 1997.
    Some of the InfoObjects (fields) of the InfoCube were empty during all this time.
    Now we need to fill a single field of the InfoCube with historical data from R/3.
    We were thinking that an option could be to upload data from the PSA in order to fill only the required field (InfoObject).
    Is this possible? Is there any problem with uploading from the PSA requests directly to the InfoCube?
    Some people on our team think the data may get duplicated... are they right?
    Which other solutions can we adopt to solve this issue?
    We will appreciate all your valuable help.
    Thanks in advance.
    Regards.
    Julio Cordero.

    Remodeling in BI 7:
    /people/mallikarjuna.reddy7/blog/2007/02/06/remodeling-in-nw-bi-2004s
    http://www.bridgeport.edu/sed/projects/cs597/Fall_2003/vijaykse/step_by_step.htm
    Hope it helps..

  • Remote historical data is not retrieved completely when viewing it in MAX 4

    Hi,
    since I installed LabVIEW 8 I have some problems retrieving historical data from another computer. Sometimes not all data is retrieved (if I zoom in or out or move back in time), and this missing data won't ever be retrieved.
    I already deleted the Citadel cache once, but after this even less data was retrieved... What's really weird is that for channels which weren't retrieved correctly, the data does not get updated anymore!
    On the remote computer I have a LabVIEW DSC Run-Time 7.1 running (MAX 3.1.1.3003); on my local computer MAX 4.0.0.3010 and LabVIEW 8 DSC (development system) are installed in parallel with LV DSC 7.1.1 (dev system). LV 8 is installed for testing purposes (I doubt we'll switch soon), and overall I like MAX 4. The HyperTrend.dll on my local computer is version 3.2.1017.
    This is really quite an annoying bug!
    So long,
        Carsten
    Message Edited by cs42 on 02-02-2006 09:18 AM

    Hi,
    > We've been unable to reproduce this issue. If you could provide some additional information, it might help us out.
    I did fear this, as even on my computer it is happening just sometimes...
    > 1) How many traces are you viewing?
    The views I observed this in had 2 to 13 traces.
    > 2) How often are the traces being updated?
    For some it's pretty often (about once a second); for some it's very infrequent (no change in data, meaning they are updated because of the maximum time between logs). I more often see this for traces that are updated very infrequently, but I think I've seen it for frequent traces as well (for those it currently works).
    > 3) Are the traces being updated by a tag value change, or by the "maximum time between logs" setting in the engine?
    It happened for both types.
    > 4) What is the frequency of the "maximum time between logs" setting?
    Max time between logs is 10 minutes.
    > 5) Is the Hypertrend running in live mode when you zoom out/pan?
    I think it happened in both modes, but it definitely did in live mode.
    > 6) If you disable/re-enable live mode in the Hypertrend, does the data re-appear?
    I couldn't trigger the loading of the data. All I did was wait and work with MAX (zooming, panning, looking at data), and after quite a while (some hours) the data appeared.
    I just tested this on a view where data is missing (for some days now!), and it didn't trigger data reloading. Zooming and panning don't either. There's a gap of up to 3 days now for some traces. 7 of the 13 traces of this view are shown incompletely, all stopping at the same time but reappearing at different times.
    AFAIR from the laboratory computer (these are temperatures and it's very plausible that they didn't change), there wasn't any change in these traces, so they all got logged because of the max time...
    I just created a new view and added these traces: the gap is there as well.
    (Sorry to put this all in this entry even though it is related to your other questions, but I started this live test with disabling/re-enabling live mode.)
    > 7) Are the clocks on the client and server computers synchronized? If not synchronized, how far apart are the times on the two computers?
    They should be (Windows 2000 Domain synchronized to ADS), but are 5 seconds apart.
    One thing I remember now: I have DIAdem 10 beta 2 installed (10.0.0b2530, USI + DataFinder 1.3.0.2526). There I had (and reported) some problems with data loading from a Citadel database on a remote machine as well. This was attributed to a cache problem. Maybe a component is interfering?
    Thanks for investigating.
    Cheers,
        Carsten

  • How to populate historical data in the appraisal

    Hello,
    I am working in the Oracle PMS module. Our requirement is to display historical data of an employee in the next appraisal cycle.
    E.g., if there are 3 appraisal cycles in a year: in the first appraisal the manager will add objectives and competencies for an employee. Let's say he added Competency 1, Competency 2, Objective 1 and Objective 2, and completed the first cycle.
    When the manager initiates the second appraisal, the 2 values (Objective 1 and Objective 2) should be auto-populated in the objectives block. Likewise, the 2 values (Competency 1 and Competency 2) should be auto-populated in the competency block.
    Here the manager can add more objectives or competencies.
    Please suggest how this can be achieved.
    Thanks in advance,
    sheetal

    Hi Shunhui
    Answers :-
    If the required fields are not available in the Maintenance link, does it mean that I have to go to SE11 and create an append structure for the extract structure MC02M_0ITM, which belongs to the DataSource 2LIS_02_ITM?
    Then I have to write code in the user exit for transactional DataSources in RSAP0001?
    Ans: That's correct. Once you have done this on the R/3 side, replicate the DataSource on the BW side. Activate the transfer rules with appropriate mapping for the 3 new fields, and also adjust the update rules so that these 3 new fields are available in the InfoProviders.
    Are you able to provide some information on the errors that might arise during transports with the delta being active in the production system? Will data be corrupted? Will I need to clear the delta queue before the transport goes over?
    Ans: I assume that deltas are active in the production system. There is a risk to your active delta into BW.
    Before your R/3 transports reach the production system, you need to clear the delta queue, which means bringing the number of LUWs for 2LIS_02_ITM to zero. Also stop the delta collector job in R/3 for application 02 (Purchasing). Make sure that there are no postings (transactions/users are locked) so that there will be no changes in the purchasing base tables.
    Then send your R/3 transport into the production system (append structure, new fields + user exit code). Replicate on the BW side. Now send the BW transports to the BW production system.
    I had done this earlier for 2LIS_12_VCITM and had a step-by-step procedure written down... I will have to search for that document. Let me know your email ID and I will try to send it once I find the document.
    Regards
    Pradip
    (When you edit the question, there are option buttons with which you can assign points; for that you need to log on to SDN.)

  • Historical data in SIS Reports

    Hi,
    We just implemented SIS in our company last week and have successfully been able to fetch data from the date we actually went live. However, as per the user requirement, they would also be interested in seeing the data for the past couple of years. Can somebody please let me know whether there is any way we can get the historical data into my MCSI report?
    Waiting for your reply at the earliest.
    Thanks & Regards,
    Gaurav

    Dear Gaurav,
    Yes, it is very much possible to get your historical data.
    All you need to do is:
    Go to transactions OLI7 for updating orders, OLI8 for deliveries and OLI9 for billing documents.
    The system updates all the data in a different version, "&(".
    After checking the correctness of the data in this version, you can copy it to the main data version, which is version 000; this can be done through transaction OLIX.
    You can also see more on this at the following path:
    SPRO > OLIS > Logistics Data Warehouse > Data Basis > Tools > Setup of statistical data.
    Read the help given for these applications.
    Regards
    Suresh S
    Award points if it helps.

  • Inventory 0IC_C03, issue with historical data (data before stock initialization)

    Hi Experts,
    We followed the Inventory Management implementation as below.
    Initialization data and delta record data match the ECC MB5B data correctly, but historical data (2007 and 2008, up to the date before initialization) is not showing correctly (compared with stock on the ECC side); it shows only the difference quantity between the stock initialization data and the date of the query.
    We have done all the initial settings in BF11, set up the process keys, and filled the setup tables for the BX and BF DataSources; we are not using the UM DataSource.
    1. We loaded the BX data and compressed the request (without a tick mark at "No Marker Update").
    2. We initialized the BF data and compressed the request (with a tick mark at "No Marker Update").
    3. For deltas we are compressing requests daily (without a tick mark at "No Marker Update").
    Is this the correct process?
    As you mentioned, for BX there is no need to compress (should we not compress the BX request?),
    and do we need to compress the delta requests?
    We have an issue with historical data validation.
    Here is an example:
    We initialized on May 5th 2009.
    We loaded BX data from 2007 (historical data).
    When we look at the data for January 1st 2007, on the BI side it shows the value with a negative sign;
    on ECC it shows a different value.
    For example, ECC stock on January 1st 2007: 1500 KG.
    Stock at initialization on May 5th 2009: 2200 KG.
    On the BI side it shows: -700 KG.
    2200 + (-700) = 1500,
    but on the BI side it is not showing 1500 KG
    (it shows values in negative with reference to the initialization stock).
    Can you please tell us whether this process is correct, or whether we did something wrong in the data loading?
    In the validity table (L table) there are 2 records with SID values 0 and -1; is this correct?
    Thanks in advance.
    Regards,
    Daya Sagar
    Edited by: Daya Sagar on May 18, 2009 2:49 PM

    Hi Anil,
    Thanks for your reply.
    1. You have performed the initialization on 15th May 2009.
    yes
    2. For the data after the stock initialization, I believe that you have either performed a full load from the BF DataSource for the data from 16th May 2009 onwards, or you have not loaded any data after 15th May 2009.
    For BF, the delta data after stock initialization was compressed with the marker update option unchecked.
    If this is the case, then I think you need to:
    1. Load the data of 15th May (from the BF DataSource) separately.
    Do you mean the BF (material movements) data of 15th May is to be compressed with the "No Marker Update" option unchecked, as we do for the BX DataSource?
    2. Compress it with the "No Marker Update" option unchecked.
    3. Check the report for the data on 1st Jan 2007 after this. If this is correct, then all the history data will also be correct.
    After this you can perform a full load till date
    Here, does "till date" mean 15th May is not included?
    for the data after the stock initialization, and then start the delta process. The data after the stock initialization (after 15th May 2009) should also be correct.
    can you please clarify these doubts?
    Thanks
    Edited by: Daya Sagar on May 20, 2009 10:20 AM

  • How to extract historical data from BusinessObjects data marts to SAP BW

    Hi Guys,
    I have a scenario where I need to extract or convert historical data residing in data marts of BusinessObjects into SAP BI InfoCubes.
    Has anyone experienced or have an idea about this type of scenario?
    If you have experience or an idea, please guide me. If you know of any documentation I can follow to work out this task, please refer me to it.
    Thanks,
    Vikram B

    Hello Vikram,
    I recommend posting this query to the Integration Kits - SAP forum.
    That forum is dedicated to topics related to the BusinessObjects Integration Kit for SAP.
    It is monitored by qualified technicians and you will get a faster response there.
    Also, all SAP kit queries remain in one place and thus can be easily searched.
    Thanks a lot,
    Falk

  • Historical Data Maintenance

    Dear Members,
    This is my second post in the forum. Let me explain the scenario first,
    "We have 2 Tools -Tool1 and Tool2 which points to 2 different databases - db1 and db2 respectively. currently, the db1 performance is very poor due to huge data. we want to have only latest 18 months data in db1. The oldest data beyond 18 months should remain in db2 (in read only mode)which Tool 2 connects to. So, whenever i need historical data, i`ll use tool2. At regular intervals the data from db1 should move to db2."
    My idea is to use partitioning and logical standby. At the end of each month, the oldest one month data will be moved to db2. But please let me know whether this will be feasible to the above concept. If so, how to implement this and if not, what would be the right solution for this?
    Regards,
    Mani
    TCS

    Partitioning is great on the source side (assuming you partition by date, of course).
    I am not sure how logical standby would help on the destination. The point of logical standby is to keep the standby database up to date with the primary, so the standby database would not be read only, it would be constantly applying transactions from the primary. And when you drop a partition on the primary, you would drop the partition on the standby, so the standby wouldn't maintain history.
    Instead of logical standby, you could use Streams to replicate transactions and configure Streams to ignore certain DDL operations like partition drops. That would allow you to retain history on db2 but wouldn't give you a read-only db2 database.
    You could potentially do partition exchange in db1 at a regular interval, moving the data you want to remove into a non-partitioned staging table, move that table to db2 (via export/import, transportable tablespaces, etc), and do a partition exchange to load the data into the partitioned table on db2. That gives you a read only db2 and lets you retain history, but requires some work to move the data around every month.
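    The partition-exchange cycle described above could be sketched roughly as follows. This is a sketch under assumptions: the table names (sales_hist on db1, sales_hist_archive on db2), the partition name, and the staging table sales_stage are all hypothetical, and the transport step between the databases (Data Pump, transportable tablespaces, etc.) is only indicated in a comment.

    ```sql
    -- On db1: swap the oldest partition's rows out into a staging table.
    -- WITHOUT VALIDATION skips row-by-row boundary checks, which is safe
    -- because the rows come straight from the matching partition.
    ALTER TABLE sales_hist
      EXCHANGE PARTITION sales_2007_01
      WITH TABLE sales_stage
      INCLUDING INDEXES WITHOUT VALIDATION;

    -- Move sales_stage from db1 to db2 by whatever transport you prefer,
    -- e.g. Data Pump export/import or transportable tablespaces.

    -- On db2: swap the staged rows into the matching partition of the
    -- read-only history table.
    ALTER TABLE sales_hist_archive
      EXCHANGE PARTITION sales_2007_01
      WITH TABLE sales_stage
      INCLUDING INDEXES WITHOUT VALIDATION;

    -- Back on db1: drop the now-empty partition.
    ALTER TABLE sales_hist DROP PARTITION sales_2007_01;
    ```

    The attraction of this pattern is that the exchanges are data-dictionary operations, not row movement, so each monthly cycle is fast regardless of partition size.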
    Of course, if you decide to partition db1, assuming you did it correctly, I would tend to expect that the performance problems would go away (or at least that archiving the old data wouldn't affect performance any longer). One of the points of partitioning is that Oracle can then do partition elimination for your queries so that it only needs to look at the current partition if that's all tool1 is interested in. So perhaps all you need to do is partition db1 and you don't need db2 at all.
    Justin

  • Historical data transfer in HRMS

    Hi,
    I need the best possible way of transferring historical data to HRMS. I have studied ADE and Web ADI; do any of you know how to use them?
    Regard,
    Shahzad

    Migrating historical data within Oracle HRMS is not the easiest task in the world. I have experimented with Web ADI because it makes use of the API calls.
    In my experience I would suggest only using Web ADI for small amounts of data in non-complex scenarios. If you are considering migrating a full set of historical data, I would suggest a more technical approach:
    a combination of SQL*Loader to stage the data, before PL/SQL procedures make calls to the APIs to insert/update the data, etc.
    As cdunaway mentions, it may really depend on whether your HRMS environment is live. The PL/SQL procedure route gives you much more flexibility, as you can apply logic to the load of data. Due to date tracking this can get messy... Correction / Update / Insert / Insert Override! Ouch.
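    As a rough illustration of the SQL*Loader + PL/SQL route, the driver block often looks something like the sketch below. Everything here is hypothetical: the staging table xx_hr_stage, its columns, and the status values are made up for illustration, and the actual delivered API call and its parameter list vary by HRMS release, so check the API reference for yours.

    ```sql
    -- Hypothetical driver: SQL*Loader has filled xx_hr_stage, and this
    -- block feeds each NEW row to a delivered API, one savepoint per row,
    -- so one bad row cannot roll back the whole load.
    DECLARE
      l_errors PLS_INTEGER := 0;
    BEGIN
      FOR r IN (SELECT stage_id, last_name, first_name, hire_date
                  FROM xx_hr_stage
                 WHERE status = 'NEW'
                 ORDER BY stage_id) LOOP
        BEGIN
          SAVEPOINT before_row;
          -- Call the appropriate delivered API here, e.g.
          -- hr_employee_api.create_employee (parameter list varies by
          -- release; consult the API documentation).
          UPDATE xx_hr_stage
             SET status = 'LOADED'
           WHERE stage_id = r.stage_id;
        EXCEPTION
          WHEN OTHERS THEN
            ROLLBACK TO before_row;
            l_errors := l_errors + 1;
            UPDATE xx_hr_stage
               SET status = 'ERROR',
                   err_msg = SUBSTR(SQLERRM, 1, 2000)
             WHERE stage_id = r.stage_id;
        END;
      END LOOP;
      COMMIT;
      DBMS_OUTPUT.PUT_LINE('Rows in error: ' || l_errors);
    END;
    /
    ```

    The per-row savepoint plus status/error columns is what gives this approach its flexibility over Web ADI: failed rows stay in the staging table with their error message, ready to be corrected and rerun.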

  • Cisco ANM - Exporting Historical Data (VA)

    Hi Experts,
    I am looking for a way to access historical data on the Cisco ANM 5.2.1 Virtual Appliance. In the documentation I found that raw data should be stored in
    /var/lib/anm/export/historical-data/date-stamp.
    The problem is that the VA is a locked-down environment with no access to its content.
    Has anyone found a way to use external scripts to gather historical data in CSV format?
    BR
    Marcin

    This exception is sometimes seen when the 4.0(5) server had been upgraded from 4.0(3) or a lower version originally.  The missing information in DCD does not cause any issues on 4.0, but it causes an issue for the DMT.
    Attached is a document that is often helpful to replace this file and correct the problem.  The blob value that needs to be copied into the entry in DCD is contained within the doc.
    Thanks,
    Brendan

  • Business partner TAX number historical data

    Hi!
    One of the tax numbers (RU3) changes for a business partner when the address is changed.
    I need historical data for tax numbers, to be able to print output documents with the correct tax numbers on the delivery date.
    We use time-dependent data for central data (BUT000), bank data and address data, but I have not found how to enter a time period for tax numbers.
    Any ideas?
    Andrey Garshin.

    Hi flyboy,
    Follow this steps.
    1. Go to Administration / Setup / Financials / Tax / Tax Code Determination. The Tax Code Determination screen will appear. In Key Fields 1 select Business Partner; leave Key Fields 2 & Key Fields 3 empty.
    2. Double-click on the priority field Serial No., select the business partner, then double-click on the Serial Number field and enter the Effective Date & Tax Code there.
    I think this will help you.
    Thanks,
    Srujal Patel

  • Exporting historical data to text file with MAX misses columns of data?

    I am using LabVIEW 7.1 with DSC module 7.1 and want to export data to either Excel format or .txt.
    I have tried this with the historical data export in MAX, and also programmatically with the "Write Traces To Spreadsheet File.vi" available in the DSC module. All the tags in my tag engine file (*.scf) are defined to log data and events. Both the tag engine update deadband and the database update deadband are set to 0%.
    My exported Excel or text file seems reasonable, except that some columns of data are missing. I don't understand why data from these tags is not in the exported files, since they have the same setup in the .scf file as other tags which are exported okay.
    All defined tags can be seen using the NI HyperTrend or MAX, including the ones that are not correctly exported to file.
    Comments on this are appreciated.
    Best regards,
    Ingvald Bardsen

    I am using LV and DSC 7.1 with a PCI-6251 and MAX 4.2. In fact, just one column of values does not make sense. Attached is the exported Excel file. The last column, called ...V-002, is the problem. I put in probes to check the values, and they show correctly, but when the file is exported it shows wrong values.
    I solved the problem of missing values in a column by putting 0% in the deadband field.
    thank you for your help
    Attachments:
    qui, 2 de ago de 2007 - 132736.xls (21 KB)
