Historical data move.

Hi,
In my database I have several main transactional tables containing 8 to 9 years of data, but my users rarely fetch data older than 2 years. At times this volume of data impacts application performance. I now want to move historical data out of my main transactional tables to keep them as small as possible. I have listed a few options below. Could you please explain which option is better and why? Specifically, what is the advantage of keeping historical data in a different tablespace?
1. Create history tables in the same schema and keep each table's data older than 2 years in the same tablespace.
2. Create a separate history schema and keep each table's data older than 2 years in a different tablespace.
3. Don't move the data; create range partitions by date.
Regards,
JM

Your first two options seem to imply that you think there is some inherent correlation between schemas and tablespaces, or that one choice might perform better than another. Not so.
If you create a separate table for the historical data, it's a separate table. Period. As far as performance goes, it doesn't matter if it is in a separate schema or not. It doesn't matter if it is in a separate tablespace or not. The simple fact that you have moved it to a separate table (removing it from the table with the 'current' data) means that queries on the 'current' table won't have to wade through the historical data.
But even before reading your proposed solutions, I was thinking of option 3. Your situation is exactly what partitioning is most often used for. The beauty of partitioning is that Oracle can figure out that it doesn't have to wade through the 'historical' partitions if it can tell from the SELECT predicates that it doesn't need what's there. Plus, if you do need the historical data, you don't have to write a join of the two tables. In short, your app doesn't have to be concerned with what's historical vs. what's current.
Take note that partitioning is an extra cost option.
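For reference, option 3 might look like the following sketch. The table, column, and partition names are invented for illustration, and the INTERVAL clause assumes Oracle 11g or later:

```sql
-- Illustrative sketch only: names (txn_history, txn_date, p_hist) are made up.
-- Rows are placed in yearly partitions by txn_date; new partitions are
-- created automatically as data arrives thanks to INTERVAL partitioning.
CREATE TABLE txn_history (
  txn_id    NUMBER        NOT NULL,
  txn_date  DATE          NOT NULL,
  amount    NUMBER(12,2)
)
PARTITION BY RANGE (txn_date)
INTERVAL (NUMTOYMINTERVAL(1, 'YEAR'))
(
  PARTITION p_hist VALUES LESS THAN (DATE '2012-01-01')
);

-- A query with a date predicate lets the optimizer prune the
-- historical partitions and scan only the recent ones:
SELECT SUM(amount)
FROM   txn_history
WHERE  txn_date >= DATE '2013-01-01';
```

With this in place, queries that filter on the partition key never touch the old partitions, which is exactly the "don't wade through historical data" benefit described above.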

Similar Messages

  • HOW TO TRANSFER HISTORICAL DATA FROM ONE ACCOUNT TO ANOTHER

Product: FIN_GL
Date written: 2006-05-29
HOW TO TRANSFER HISTORICAL DATA FROM ONE ACCOUNT TO ANOTHER
=============================================================
PURPOSE
This note explains how to transfer balances for a specific period between accounts.
Explanation
Using GL's Mass Maintenance function, you can move balances from one account to another, or from multiple accounts into a single account.
1. From the GL Responsibility, choose Other > Mass Maintenance.
2. Enter a Request Name and Description for the Move/Merge operation.
3. Choose Move or Merge as the Request Type.
4. Enter a line number for the source-to-target account mapping.
5. Choose the source account from the LOV and enter it.
All accounts must be enabled.
6. Choose the target account from the LOV as well.
7. You can verify the requested operation before running it.
8. Save your work.
9. Run the Move/Merge request.
    Example
    N/A
    Reference Documents
    Note. 146050.1 - How to Transfer Historical Data from One Account to Another


  • Remote historical data is not retrieved completely viewing it in MAX4

    Hi,
since I installed LabVIEW 8 I have had some problems retrieving historical data from another computer. Sometimes not all data is retrieved (if I zoom in or out or move back in time), and this missing data is never retrieved.
I already deleted the Citadel cache once, but after that even less data was retrieved... What's really weird is that for channels which weren't retrieved correctly, the data no longer gets updated at all!
On the remote computer I have a LabVIEW DSC Runtime 7.1 running (MAX 3.1.1.3003); on my local computer, MAX 4.0.0.3010 and LabVIEW 8 DSC (development system) are installed in parallel with LV DSC 7.1.1 (dev system). LV 8 is installed for testing purposes (I doubt we'll switch soon) and overall I like MAX 4. The HyperTrend.dll on my local computer is version 3.2.1017.
    This is really a quite annoying bug!
    So long,
        Carsten
    Message Edited by cs42 on 02-02-2006 09:18 AM

    Hi,
    > We've been unable to reproduce this issue. If you could provide some additional information, it might help us out.
    I did fear this, as even on my computer it is happening just sometimes...
    > 1) How many traces are you viewing?
    The views I observed this in had 2 to 13 traces.
    > 2) How often are the traces being updated?
    For some it's pretty often (about once a second), for some it's very infrequent (no change in data, that means updated because of max time between logs). I more often see this for traces that are updated very infrequently. But I think I've seen this for frequent traces as well (for these it does work currently).
    > 3) Are the traces being updated by a tag value change, or by the "maximum time between logs" setting in the engine?
    It happened for both types.
    > 4) What is the frequency of the "maximum time between logs" setting?
    Max time between logs is 10 minutes.
    > 5) Is the Hypertrend running in live mode when you zoom out/pan?
I think it happened in both modes, but it definitely did in live mode.
    > 6) If you disable/re-enable live mode in the Hypertrend, does the data re-appear?
I couldn't trigger the loading of the data. All I did was wait and work with MAX (zooming, panning, looking at data), and after quite a while (some hours), the data appeared.
    Just tested this on a view where data is missing (for some days now!), and it didn't trigger data reloading. Zooming and panning don't as well. There's a gap of up to 3 days now for some traces. 7 of the 13 traces of this view are incompletely shown. All stopping at the same time but reappearing at different ones.
AFAIR from the laboratory computer (these are temperatures and it's very plausible that they didn't change), there wasn't any change in these traces, so they all got logged because of max time...
    I just created a new view and added these traces: the gap is there as well.
(Sorry to put this all in this entry even though it relates to your other questions, but I started this live test with disable/re-enable live mode.)
> 7) Are the clocks on the client and server computers synchronized? If not synchronized, how far apart are the times on the two computers?
    They should be (Windows 2000 Domain synchronized to ADS), but are 5 seconds apart.
    One thing I remember now: I have installed DIAdem 10 beta 2 (10.0.0b2530, USI + DataFinder 1.3.0.2526). There I had (and reported) some problems with data loading from a Citadel Database of a remote machine as well. This was accounted to some cache problem. Maybe a component is interfering?
    Thanks for investigating.
    Cheers,
        Carsten

  • Reloading Historical Data into Cube: 0IC_C03...

    Dear All,
I'm working on SAP BW 7.3 and have five years of historical data loaded into InfoCube 0IC_C03 (Material Stocks/Movements (as of 3.0B)); the deltas are also scheduled (DataSources 2LIS_03_BF & 2LIS_03_UM) for the InfoCube. I have a new business requirement to reload the entire historical data into the InfoCube with the addition of "Fiscal Quarter", and I do not have a DSO in the data flow. I know it's a bit of a tricky InfoCube (Inventory Management) to work on, and reloading the entire historical data can be challenging.
    1. How to approach this task, what steps should I take?
2. Drop/delete the entire data from the InfoCube, then delete the setup tables, refill them, and run a "Repair Full Load" - is this the way?
    3. What key points should I keep in mind, as different BO reports are already running and using this InfoCube?
4. Should I run "Repair Full Load" requests for all DataSources (2LIS_03_BX, 2LIS_03_BF & 2LIS_03_UM)?
    I will appreciate any input.
    Many thanks!
    Tariq Ashraf

    Hi Tariq,
Unfortunately, you will need some downtime to execute a stock initialization in a live production system; otherwise you cannot guarantee data integrity. There are, however, certainly ways to minimize it. Please see SAP Note 753654 - How can downtime be reduced for setup table update. You can dramatically reduce the downtime by distinguishing between closed and open periods. The closed periods can be updated retrospectively.
    To make it more concrete, you could consider e.g. the periods prior to 2014 as "closed". You can then do the initialization starting from January 1, 2014. This will save you a lot of downtime. Closed periods can be uploaded retrospectively and do not require downtime.
Re. the marker update, please have a look at the document Re-initialization of the Material Stocks/Movements cube (0IC_C03) with 2LIS_03_BF, 2LIS_03_BX and 2LIS_03_UM in BW 7.x; steps 10, 11, 12 and 13 contain important information.
    Re. the steps, it looks OK for me but please double check with the document Re-initialization of the Material Stocks/Movements cube (0IC_C03) with 2LIS_03_BF, 2LIS_03_BX and 2LIS_03_UM in BW 7.x.
    Please have a look at the following document for more background information around inventory management scenarios:
    How to Handle Inventory Management Scenarios in BW (NW2004)
    Last but not least, you might want to have a look at the following SAP Notes:
    SAP Note 436393 - Performance improvement for filling the setup tables;
    SAP Note 602260 - Procedure for reconstructing data for BW.
    Best regards,
    Sander

  • Managing historical data

    Hi all
I have some historical data for some accounts (share capital etc.) for which I do not want to translate or calculate or do anything with it. I just want it to show up in my report and get consolidated.
Do I proceed by creating a member list and identifying those accounts, or is there an easier way to do so? Also, creating a static member list will be of no use, since there might be more accounts later and every time I would have to add them to the list.
    Please let me know how to proceed.
    Regards,
    Ramith

    There are several different ways to do historical overrides (usually USD) depending on your metadata setup. One example would be to set up a new statistical account for each existing account you want to override. Then set up a new hierarchy in a custom dimension (typically the one with BS movements / Cash Flow / etc). In this hierarchy you can set up members such as HistUSDAdjs. You'll input the override amount in this new custom member in the new stat account. Then, you'll have some new rules which will copy the override amount to USD (and probably ending balance) in the Balance Sheet account. You'll also want to have a rule to copy the HistUSDAdj amount in the stat account over from the prior period so you only have to enter the override when it changes and not each month going forward.

  • Reloading Historical Material Movements

I've recently reloaded the Material Movements InfoSource, which populates the cube 'Material Movements' as well as the ODS 'Material Documents'. I did the update with only the latest data, as the documentation said I could then go back afterwards and perform a full update for historical data (for example, one month at a time). The historical loads have worked great for the cube; however, when looking at the ODS I can see that all of my historical loads have yet to be activated. The problem is that when I go to activate them I get the errors:
"Full 57,714 has overlapping selections with init."
"0000056305: No updates into ZMMO1"
I was very careful not to have any overlapping selections, and they are definitely not overlapping (and have loaded successfully into the cube), so I don't understand what the problem is. I did notice in my documentation that, if loading to an ODS, I had to do a 'full repair' instead of a 'full update'. But I don't see where the option for 'full repair' is.
    Any suggestions?
    Thanks!

To load data into the cube, go to the InfoSource tab in RSA1 and search for an InfoSource called 8<your ODS name>. Here you can create an InfoPackage that will load the full/delta (as the case may be) to your cube. You can execute this just like any other load.
FYI: the automatic load-to-datamart setting can be found in ODS maintenance. Just double-click your ODS, and in the right-hand pane open up the Settings; it will be the last one, below "automatic activate".
    Hope this helps...

  • Inventory 0IC_C03, issue with historical data (data before Stock initializa

    Hi Experts,
We followed the Inventory Management implementation as below.
Initialization data and delta records match the ECC MB5B data correctly, but historical data (2007 and 2008, up to the date before initialization) is not showing correctly (versus stock on the ECC side); it shows only the difference quantity between the stock initialization data and the date of the query.
We have done all the initial settings in BF11, set the process keys, and filled the setup tables for the BX and BF DataSources; we are not using the UM DataSource.
1. We loaded BX data and compressed the request (without the tick mark at "No Marker Update").
2. We initialized BF data and compressed the request (with the tick mark at "No Marker Update").
3. For deltas we are compressing requests daily (without the tick mark at "No Marker Update").
Is this the correct process?
Also, as you mentioned, for BX there is no need to compress (should we not compress the BX request?),
and do we need to compress the delta requests?
We have an issue with historical data validation.
Here is the example:
We initialized on May 5th 2009.
We loaded BX data from 2007 (historical data).
When we look at the data for January 1st 2007, the BI side shows the value with a negative sign,
while ECC shows a different value.
For example, ECC stock on January 1st 2007: 1500 KG
Stock at initialization on May 5th 2009: 2200 KG
On the BI side it shows: -700 KG
2200 + (-700) = 1500,
but the BI side is not showing 1500 KG
(it shows values in negative with reference to the initialization stock).
Can you please tell whether this process is correct, or whether we went wrong in the data loading?
In the validity table (L table) there are 2 records, with SID values 0 and -1; is this correct?
    thanks in advance.
    Regards,
    Daya Sagar
    Edited by: Daya Sagar on May 18, 2009 2:49 PM

    Hi Anil,
    Thanks for your reply.
    1. You have performed the initialization on 15th May 2009.
    yes
    2. For the data after the stock initialization, I believe that you have either performed a full load from BF data source for the data 16th May 2009 onwards or you have not loaded any data after 15th May 2009.
For BF delta data after the stock initialization, this was compressed with the marker update option unchecked.
    If this is the case, then I think you need to
    1. Load the data on 15th May (from BF data source) separately.
Do you mean the BF (material movements) data for 15th May should be compressed with the No Marker Update option unchecked, as we do for the BX DataSource?
    2. Compress it with the No Marker Update option unchecked.
    3. Check the report for data on 1st Jan 2007 after this. If this is correct, then all the history data will also be correct.
    After this you can perform a full load till date
Here, does "till date" mean May 15th is not included?
    for the data after stock initialization and then start the delta process. The data after the stock initialization(after 15th May 2009) should also be correct.
    can you please clarify these doubts?
    Thanks
    Edited by: Daya Sagar on May 20, 2009 10:20 AM

  • Historical Data Maintenance

    Dear Members,
    This is my second post in the forum. Let me explain the scenario first,
"We have 2 tools - Tool1 and Tool2 - which point to 2 different databases - db1 and db2 respectively. Currently, db1's performance is very poor due to the huge volume of data. We want to keep only the latest 18 months of data in db1. The older data, beyond 18 months, should remain in db2 (in read-only mode), which Tool2 connects to. So whenever I need historical data, I'll use Tool2. At regular intervals the data from db1 should move to db2."
My idea is to use partitioning and a logical standby. At the end of each month, the oldest month of data would be moved to db2. Please let me know whether this is feasible for the above concept. If so, how do I implement it, and if not, what would be the right solution?
    Regards,
    Mani
    TCS

    Partitioning is great on the source side (assuming you partition by date, of course).
    I am not sure how logical standby would help on the destination. The point of logical standby is to keep the standby database up to date with the primary, so the standby database would not be read only, it would be constantly applying transactions from the primary. And when you drop a partition on the primary, you would drop the partition on the standby, so the standby wouldn't maintain history.
    Instead of logical standby, you could use Streams to replicate transactions and configure Streams to ignore certain DDL operations like partition drops. That would allow you to retain history on db2 but wouldn't give you a read-only db2 database.
    You could potentially do partition exchange in db1 at a regular interval, moving the data you want to remove into a non-partitioned staging table, move that table to db2 (via export/import, transportable tablespaces, etc), and do a partition exchange to load the data into the partitioned table on db2. That gives you a read only db2 and lets you retain history, but requires some work to move the data around every month.
    Of course, if you decide to partition db1, assuming you did it correctly, I would tend to expect that the performance problems would go away (or at least that archiving the old data wouldn't affect performance any longer). One of the points of partitioning is that Oracle can then do partition elimination for your queries so that it only needs to look at the current partition if that's all tool1 is interested in. So perhaps all you need to do is partition db1 and you don't need db2 at all.
    Justin
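The monthly partition-exchange flow described above might be sketched roughly as follows. All object names are illustrative, and the transfer step between the databases (Data Pump, transportable tablespaces, etc.) is elided:

```sql
-- Illustrative names only: txn_history, txn_stage, p_2013_01, txn_history_archive.

-- On db1: swap the oldest partition's data out into a staging table.
-- EXCHANGE PARTITION is a metadata operation, so it is nearly instantaneous.
ALTER TABLE txn_history
  EXCHANGE PARTITION p_2013_01 WITH TABLE txn_stage;

-- The partition is now empty; drop it on db1.
ALTER TABLE txn_history DROP PARTITION p_2013_01;

-- Move txn_stage to db2 (export/import, transportable tablespace, etc.),
-- then on db2 swap the staged rows into the archive table's partition:
ALTER TABLE txn_history_archive
  EXCHANGE PARTITION p_2013_01 WITH TABLE txn_stage;
```

The staging table must match the partitioned table's column structure for the exchange to succeed; the payoff is that db2 stays read-only between archive runs while still accumulating history.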

  • Historical Data migration

    Hello Experts!
    Our client is switching from a SAP 4.7 to SAP ECC 6.0. We're going to move all open items (customers, vendors), the assets, master data...etc via LSMW.
    One question remains: What happens with the historical data? They are not willing to keep the old systems just for consulting data (this would generate a yearly fee to host the systems). The client has IXOS as archiving tool.
    My question is rather if it is possible to make a huge copy of all transactional data or not.
thanks for the answers.
    Regards
    FX

    Hi,
    Your situation is a bit unclear... If you are going to upgrade your SAP system, why do you have to migrate the data via LSMW?
    Regards,
    Eli

  • MP31 - no historical data - how to generate?

    Hi specialists,
My problem: I have several materials for which I cannot execute a forecast, because there is "no historical data". The materials are very old and there was certainly consumption on them in the past, but these materials were never under MRP before.
Is there a way to generate this historical data automatically, maybe even in mass? Or do I really have to do it manually, material by material?
    Any help is appreciated.
    Tnx a lot
    Kurt

    Hi Kurt,
Historical consumption data gathering in the system does not depend on whether the material is under MRP; it depends on the definitions per movement type and on the maintenance of the period in one of the plant-level views of the material master.
I don't know of any procedure in standard SAP that will "rebuild" consumption data. Nevertheless, you can always enter your "corrected" historical data, and it will be used by the system for forecasting.
You can do it manually for each material, or upload it from an external file.

  • Reporting on historical data

    Hello All,
    We have all our vendor data in MS-SQL database.
    It contains historical data (vendors with whom we don't do business anymore) and also the active vendors with whom we are in business.
    We are moving the active vendor into our ERP data using LSMW.
    We are also planning to move the inactive vendor data into some new tables in ERP to store them.
My question is: can we do reporting on this inactive vendor data from these tables?
Or is reporting possible only for the active transactional data?
Please share some ideas, as I am a beginner and I need to know this very urgently.
    Thanks in advance,
    Veena.

    Hi Sam,
    I have the same query.
With BW we can do this. But in our case the BW implementation is in Phase 2, so we want to achieve this historical data reporting in Phase 1, in ERP. I want to know if we can report on these Z-tables where we store the historical data.
It is a bit urgent; can you please let me know if it is possible?
    Thanks in advance,
    VS.

  • Historical data steps

    Hi,
What are the steps for loading historical data?

    Hi,
See the steps below for 0IC_C03; they cover historical data. When filling the setup tables, 0IC_C03 supports a date range, but in the case of SD only the document number is supported. So follow the same sequence as the 0IC_C03 loads, but in SD give the document number while filling the setup tables: run the init load, then fill the setup tables with the historical loads and do full updates in BW; once complete, run V3 and then the normal delta.
    Treatment of historical full loads with Inventory cube
    Setting up material movement/inventory with limit locking time
    Thanks
    Reddy

  • Populated the new field with historic data

    HI,
I have data for 2 years. Now I am enhancing the DataSource; how can I populate the new field with historic data? Is it by
1) deleting all the data on the BW side, then doing an init and setting up a regular delta, or
2) running a repair full request with a selection condition?
Or is there any other option available? Which is the best scenario for loading this historic data?
    Regards,
    Ravi

    Hi,
I think your DataSource is already in production, and you want data only from today onwards for the enhanced fields, i.e. historical data is not required for the enhanced fields.
1. Arrange ECC downtime of 20 to 30 minutes.
2. Keep all objects on the quality systems in ECC and BW.
3. Run delta loads in BW 2 to 3 times. With this step you can clear SMQ1 and RSA7. Check the entries in RSA7; if it is zero then it is fine.
4. Move the DataSource from ECC quality to ECC production.
5. Replicate in BW.
6. Move all BW objects from BW quality to BW production.
7. Delete the init load at the InfoPackage level (not in the ODS/Cube).
8. Load an init without data transfer.
9. Then run the delta.
10. From the next day onwards, deltas will come as usual.
If you need historical data as well:
1. Delete the data in the Cube.
2. Arrange downtime and load an init, then delta.
Check
SAP Note 328181 - Changes to extraction structures in Customizing Cockpit
    Thanks
    Reddy

  • Design issue  - Historical data - referencial integrity

    Hi,
I don't know if this is the right place to post this question, but I would like to know about the design issues in storing historical data.
I have a historical table of events (notifications, alarms, etc.).
In the historical table, should I store fields with a foreign key to the master table (the device table)?
In this design, how do I handle updates (record removal) to the master table?
Should I have a trigger to update the historical table too? Or must I not permit updates to the master table's primary keys?
And what about not storing the foreign keys in the historical data at all?
What are the disadvantages of that model?
    Thank you,
    Faria

    Faria,
    The answer depends on why you want historical data.
    Is the historical data to be used for auditing or legal purposes?
    Are you trying to move historical data for performance reasons?
    Is there some other purpose?
    If you're doing this for auditing purposes, I don't recommend using foreign keys since you won't easily be able to capture delete activity.
    I mostly use Designer for all of our designs and typically will turn on server side journaling for this purpose. Designer will then generate the journaling tables and write the triggers to manage the whole process. You don't even need to be a programmer to figure it out.
    Let me know back if this isn't your purpose or you need clarification.
    Thanks, George
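A hand-rolled equivalent of the journaling George describes (which Designer generates for you) might look roughly like this. The device table and its columns are invented for illustration:

```sql
-- Illustrative only: table and column names (device, device_jn, ...) are made up.
-- Journal table capturing every change, with no FK back to the master table,
-- so delete activity is preserved even after the master row is gone.
CREATE TABLE device_jn (
  jn_operation  VARCHAR2(3)   NOT NULL,  -- 'INS', 'UPD', or 'DEL'
  jn_timestamp  DATE          NOT NULL,
  device_id     NUMBER,
  device_name   VARCHAR2(100)
);

CREATE OR REPLACE TRIGGER device_jn_trg
AFTER INSERT OR UPDATE OR DELETE ON device
FOR EACH ROW
BEGIN
  IF DELETING THEN
    -- capture the old row, including deletes a plain FK design could not record
    INSERT INTO device_jn VALUES ('DEL', SYSDATE, :OLD.device_id, :OLD.device_name);
  ELSIF UPDATING THEN
    INSERT INTO device_jn VALUES ('UPD', SYSDATE, :OLD.device_id, :OLD.device_name);
  ELSE
    INSERT INTO device_jn VALUES ('INS', SYSDATE, :NEW.device_id, :NEW.device_name);
  END IF;
END;
/
```

Because the journal table has no foreign key to the master, historical rows survive master-row deletion, which is the point George makes about audit-oriented designs.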

  • Upload Sales Historical Data in R/3?

    Hi Guys:
We are planning to use the forecast-based planning MRP procedure (MRP type VV) for material planning, and we need historical sales data to execute the forecast.
Can anybody tell me, step by step, how to upload 12 months of historical data into SAP R/3 so that we can execute forecast-based planning for a material?
    Thanks
    Sweth
    Edited by: Csaba Szommer on May 12, 2011 9:57 AM

I need to pull data into BW from R/3 for freight rates.
For example, a rate is from source to destination (in rows), and on that basis we have condition types (in columns); based on those condition types we get the cumulated value of the master data rate.
Like:
source  destination  rate (based on condition types)
Now in R/3 some condition types have been changed to new ones, or you could say merged into new ones. For the new ones I can pull with no problem, but what do I do for the historical master data, where the condition types in 2006 differ from those in 2007 but are mapped to each other?
Thanks,
rubane
