P6 archiving old data

Hi,
Does anyone have any experience of archiving old data to a read-only database? We are currently using SQL 2000 and P6. I'm interested to hear about problems encountered and any data integrity issues. Thanks

We suffer the same issue with archiving. One of our DBs is roughly 50 GB in size, and the time to restore it from backup alone is over 6 hours, which does not include the time to pull tapes from off site. Our standard IS backup retention policy is only 6 months, so what I end up with is literally hundreds of archive copies of a 20K+ activity schedule living in the production database (hence its large size).
What we have done to try to contain the growth is that every 6 months an admin sends out an email to all users with a list of archive projects, asking which ones are still needed. Our end users then mark the archives for deletion or export. If deletion is selected, the schedule is simply purged from the database. Export means the archive project is dumped to XER and burned to disc. The hope is that, should we ever need to review an old archive copy, we can import it into a recent copy of the production DB. Luckily the need to bring one of these back has yet to occur.
Note: by archive project I am generally referring to either a baseline or what-if copy of a main schedule.

Similar Messages

  • Archive old data

    I would like to archive old iCal data without losing it. I would like to publish or export recent data to Google calendars for sharing with my department particularly calendars of who's on duty when. However, I have years of data in my duty calendars, which I don't want to lose in case I need that history. But I don't want to publish years of data, and Google calendars doesn't seem to like that big of an import. Is there a way to export just a date range, or is there a way to archive old data so that the duty calendar has only the past year's information?
    Thanks.

    You can export data from a certain date range into an .ics file and then reimport it later if needed. You can also set the sync date range to 3, 6, or 12 months.

  • Archiving old data from a partitioned table

    Hi,
    While sifting through all the options for archiving the old data from a table which is also indexed, I came across a few methods which could be used:
    1. Use a CTAS to create new tables in a different tablespace from the partitions on the existing table. Then exchange these new tables with the data in the partitions, drop the partitions on the existing table (or truncate them, although I am not sure which one is the recommended method), take this tablespace offline, and keep it archived. If you require the data in the future, exchange these partitions back with the data from the archived tables. I am not sure if I got the method exactly right.
    2. Keep an export of all the partitions which need to be archived and keep that .dmp file on a storage medium. Once they are exported, truncate the corresponding partitions in the original table. If required in the future, import these partitions.
    But I have one constraint on my DB: I cannot create a new archive tablespace to hold the tables containing the partitioned data, as I have only 1 tablespace allocated to my application on that DB, since multiple apps reside on it together. Kindly suggest which option is best suited for me. Should I go with option 2?
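    For reference, option 1 maps onto Oracle's partition-exchange pattern. A minimal sketch, assuming a range-partitioned table; the table, partition, and tablespace names here are hypothetical:

    ```sql
    -- Stage table with the same column layout as the partitioned table,
    -- created empty in the archive tablespace.
    CREATE TABLE sales_2008_arch TABLESPACE archive_ts
      AS SELECT * FROM sales WHERE 1 = 0;

    -- Swap the partition's segment with the stage table (metadata-only operation).
    ALTER TABLE sales
      EXCHANGE PARTITION p_2008 WITH TABLE sales_2008_arch
      WITHOUT VALIDATION;

    -- Drop the now-empty partition, keeping global indexes usable.
    ALTER TABLE sales DROP PARTITION p_2008 UPDATE GLOBAL INDEXES;
    ```

    Note that this pattern does assume a separate archive tablespace you can take offline, which is exactly the constraint described above, so option 2 (export/truncate) is the natural fallback.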
    Thanks in advance.

    Hi,
    Thanks a bunch for all your replies. Yes, I am planning to go ahead with option 2. Below is the method I have decided upon. Kindly verify whether my line of understanding is correct:
    1. export the partition using the clause:
    exp ID/passwd file=abc.dmp log=abc.log tables=schemaname.tablename:partition_name rows=yes indexes=yes
    2. Then drop this partition on the original table using the statement:
    ALTER TABLE tablename drop PARTITION partition_name UPDATE GLOBAL INDEXES;
    If I now want to import this dump file into my original table again, I will first have to create this partition on my table using the statement:
    3. ALTER TABLE tablename ADD PARTITION partition_name VALUES LESS THAN ( '<<>>' ) ;
    4. Then import the data into that partition using:
    imp ID/passwd FILE=abc.dmp log=xyz.log TABLES=schemaname.tablename:partition_name IGNORE=y
    Now my query here is that this partitioned table has a global index associated with it. So once I create this partition and import the data into it, do I need to drop and recreate the global indexes on this table, or is there another method to update the indexes? Kindly suggest.
    Thanks again!! :)
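    On the global-index question: `DROP PARTITION ... UPDATE GLOBAL INDEXES` maintains the global index during the drop, and a conventional-path import inserts rows one by one, which also maintains indexes, so a full drop/recreate is normally unnecessary. A hedged way to verify after the import (index and table names are hypothetical):

    ```sql
    -- Check whether any index on the table was left unusable.
    SELECT index_name, status
      FROM user_indexes
     WHERE table_name = 'TABLENAME';

    -- Rebuild only those whose STATUS shows 'UNUSABLE'.
    ALTER INDEX my_global_idx REBUILD;
    ```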

  • Archiving old data from a main database into an archived database

    Hello colleagues,
    We are trying to create a stored procedure to archive data older than 6 months (180 days) from our production database into a new archive database.
    We want to archive only 20,000 rows a day, and we need to schedule it on a daily basis. We also want to delete those archived rows from the production database.
    Could you please share your experience of archiving with us?
    Thanks

    Hi BG516, 
    Ok, I got your point now :) 
    First, how long does it take to read these 20,000 rows? It shouldn't be long, especially if the table is well indexed to cover that query (an index on the date column covering the rest of the table, basically). There are many aspects that may affect the process, but my guess is that the big deal will be deleting these old rows from your production table.
    Reading these rows will require a shared latch, and if you're reading old data your daily processes shouldn't be trying to write to these particular pages (again, it depends mainly on the indexes). Deleting them will need an exclusive lock, and that'd be more problematic; reads are far more common than writes in a data warehouse.
    When facing this kind of problem, I have always had to find an off-peak period in which to execute the required processes.
    A few things that come to my mind: 
    - Use BULK INSERT when loading the data into your historical table, so you can minimize the time spent reading from the production table
    - Check the number of indexes you'll impact when deleting these rows. The more, the worse (more time needed to maintain them)
    - What version of SQL Server are you using? The Elastic Scale feature of Azure SQL Database covers just that scenario (http://channel9.msdn.com/Shows/Data-Exposed/Azure-SQL-Database-Elastic-Scale)
    Regards.
    Pau.
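    The daily job described above could be sketched in T-SQL roughly as follows. This is a minimal sketch, not the poster's actual procedure; the database, table, and column names are hypothetical, and it assumes the archive table has the same column layout as the source:

    ```sql
    -- Move up to 20,000 rows older than 180 days in one atomic statement:
    -- OUTPUT DELETED captures the removed rows and inserts them into the
    -- archive table, so no row is lost between the delete and the insert.
    DELETE TOP (20000) src
    OUTPUT DELETED.*
      INTO ArchiveDb.dbo.Orders
    FROM ProductionDb.dbo.Orders AS src
    WHERE src.CreatedDate < DATEADD(DAY, -180, GETDATE());
    ```

    Scheduled daily (for example via SQL Server Agent), this drains the backlog 20,000 rows at a time; an index leading on the date column keeps each run's scan cheap, per the indexing advice above.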

  • Can I start up an Intel iMac (mid 2010) running OS 10.7 with a Verbatim external drive running OS 10.6 so I can run some older PowerPC apps to retrieve and archive old data?

    Can I start up an Intel iMac (mid 2010) running 10.7.4 with an external Verbatim drive running OS 10.6 so I can access data in some PowerPC applications? If so, any help on the procedure would be great. Thanks for any help.

    Yes, that model can be booted into Snow Leopard 10.6.3 or later. Use the discs that came with the computer to install a compatible version of OS X for the model.

  • Can you archive old calendar items in Entourage so you don't run out of space on your Blackberry?

    Hi everyone,
    I have a Sprint Pearl 8130 and I am syncing with PocketMac to Entourage 2004 on an iBook G4 running OS 10.3.
    Most of my syncing issues are minor (for some reason, in contacts, work and home email addresses switch fields, but I can live with that) and I don't have too many complaints about PocketMac for now....
    My question is - does anyone know how to archive old calendar items so that my Blackberry doesn't run out of space? I have calendar items going back to the beginning of 2008. I don't need past calendar items on my Blackberry, except for recurring items, like birthdays and such, and maybe a month's worth of old appointments just for reference. My Blackberry has run out of space a couple of times, and when it deletes the appointments from the Blackberry, syncing wants to either delete them from the Mac or restore them to the Blackberry. I know you can do this with a Palm and their Palm software, since I used to have one and was able to archive items older than a week; you could then access them in a separate archive profile.
    I have searched everywhere and the closest thing I can find is to purchase Missing Sync and set to match my sync date range.  https://support.markspace.com/index.php?_m=knowledgebase&_a=viewarticle&kbarticleid=327.  But that doesn't really solve my issue of running out of space on my Blackberry.  The other thing that I could think of is to export the calendar items into a .rge file, then delete the calendar items from there, sync, and have PocketMac remove items from my Blackberry, but it would be a pain to easily access old calendar items.
    I appreciate any thoughts, ideas and especially solutions!

    I have the exact opposite issue to the one you mentioned.
    My client wants his archived calendar items to be synced to his 8830 World Edition.
    He is running RIM software (Desktop Manager) v6.4 with BES 2007.
    The current issue is that his archived old calendars are not synchronizing to his handheld. We tried to force his PC to sync those items manually through the cable, which didn't help, since our BES policy forces wireless sync to the device, and wireless sync can never see the archived item files.
    I think you can set the option on your device so it turns on calendar wireless sync, and also make sure the Mac does not select the archive for sync:
    Calendar > Options > General settings > Wireless sync set to ON;
    if you are running BES, set Calendar Wireless sync off.

  • How can I delete old data in citadel

    How can I delete old data from Citadel so that the HDD is not full when data is continuously being logged to the database?

    Hi there,
    I can see different possible ways to limit the database size on a LabVIEW DSC system.
    a) You could limit the lifespan of your logged data to a certain time by adjusting the option 'Days to keep historical data' in the Tag Configuration Editor, under Configure>Historical...
    b) You could use the Citadel Database VIs. Archive Database.vi allows you to archive a database or a part of it (certain traces/tags, or a certain time interval) to another location. With the archive option 'destructive', it deletes the source data from the original database.
    Delete Traces.vi deletes traces from a database.
    c) The same/similar functionality is available through Measurement & Automation Explorer and its Historical Data Viewer plug-in.
    Note that b) and c) are new features of LabVIEW DSC 6.1!
    Hope this helps
    Roland
    PS: There is a nice link about archiving Citadel on ni.com :
    http://zone.ni.com/devzone/conceptd.nsf/webmain/2F24997EAD7C53A686256B6E00686D64?opendocument&node=DZ52101_US

  • Is there any way to clear out old data from the Order Gate tables?

    Hi Friends,
    Is there any way to clear out old data from the Order Gate forecast, tolerance, and confirmed-order-to-date tables on a monthly basis?
    That is, I want the data table to hold data only for a particular window (if today is 13-11-2008, then my data table must contain data from 13-10-2008 to 13-11-2008).
    Is this possible through ABAP development or through any transaction codes?
    Any pointers will be rewarded.
    Regards

    hi,
    Before archiving you have to set the deletion flag for the object (maybe for PO, MMR, etc., as per your requirement)...
    After that you use the archiving procedure to archive the documents which are already flagged for deletion...
    For MMR you use transaction SARA...and its respective object for the archiving...
    You can use the SDN search to see the many threads on this...
    Or use this link and search as per your requirement:
    http://help.sap.com/saphelp_erp2004/helpdata/EN/75/ee0fa855c811d189900000e8322d00/frameset.htm
    Regards
    Priyanka.P
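    Purely as an illustration of the rolling one-month window the question asks for (the table and column names are hypothetical, and in SAP you would normally go through the archiving objects via SARA rather than deleting table rows directly):

    ```sql
    -- Keep only the most recent month of rows: on 13-11-2008 this
    -- retains 13-10-2008 through 13-11-2008 and removes everything older.
    DELETE FROM order_gate_forecast
     WHERE doc_date < ADD_MONTHS(SYSDATE, -1);
    COMMIT;
    ```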

  • IDOC capturing the old data from Vendor master instead of modified data

    Hello Friends,
    Could you please guide me on the below issue  ?
    The vendor master was changed, with the address being modified at 13:00 system time. => change log shows 13:00
    The payment run and IDOC generation (RFFOEDI1) were executed at 14:00 system time. => IDOC created time is 14:00
    But still the IDOC captured the old vendor master address data.
    This is very strange to me, and I have no idea how this can happen.
    thanks
    Raghu V
    Edited by: Raghunandan Vasudevarao on Jan 27, 2011 11:29 AM

    Moderator message: I am sorry but I had to remove the "correct answer" as it was not correct.
    It is well possible to remove the purchasing view via archiving, since archiving of vendor masters is divided into 3 activities: archiving the purchasing view, archiving the company code views, and archiving general data.
    Of course there are preconditions: you cannot archive the purchasing data view if you have already created orders. In that case you have to archive the orders first.
    See the network graphic: in SARA enter object FI_ACCPAYB, then click the icon for the network.
    In that particular case you would even create inconsistencies if you deleted a table entry.
    And before you delete table entries you should, in general, know about the relationships between tables.
    Just thinking that a vendor master is LFA1, LFB1, and LFM1 is much too short. Go to DB15, enter FI_ACCPAYB (this is the archiving object) in the lower part, and see how many tables are returned. These are all the tables that can have data if you maintain a vendor master.
    Now check all those tables to see whether they have a relation to LFM1 and whether they contain data.
    What will happen if you delete the vendor despite existing orders? You will probably get a dump when you access the purchase order.

  • Best way to remove old data from db?

    Hi,
    I am looking for ways to keep my database small by removing old data from the database and archiving it somewhere else. The data may need to be put back into the database in the future. One of the things I am concerned about is how to enforce constraints; in other words, when I remove data, all the related pk/fk data should be removed together. Any suggestions will be greatly appreciated.
    Thanks

    Hope this may help you.
    The concept is: exp/imp and truncate will reclaim the space.
    Method 1:
    Do an export and secure the dump (say dump1) for future purposes.
    Delete the unwanted data and commit.
    Now export the tables again (say dump2), truncate them, and then import dump2.
    /* Importing into the truncated tables reclaims the unused space. */
    Method 2:
    /* depends on the volume of data */
    -- create a backup of the existing table for the future
    create table backup_table nologging as select * from original_table;
    -- create a copy of the existing table
    create table copy_table nologging as select * from original_table;
    -- truncate the original table, so that the high-water mark is reset
    truncate table original_table;
    -- insert the rows you want to keep from the copy back into the original table
    insert /*+ APPEND */ into original_table select * from copy_table where (your conditions for data);
    commit;
    /* Now you have 3 tables:
    backup_table - with all the data - for future reload
    original_table - with unwanted data deleted and space reclaimed
    copy_table - a copy of the original table
    Now drop the copy: */
    drop table copy_table;
    regards
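    On the pk/fk point in the original question: child rows must be archived and removed before their parents (unless the foreign keys are declared ON DELETE CASCADE). A minimal sketch with hypothetical parent/child tables and a 24-month retention window:

    ```sql
    -- Archive then delete the child rows first, so the foreign key
    -- from order_items to orders is never violated.
    INSERT INTO order_items_arch
      SELECT * FROM order_items
       WHERE order_id IN (SELECT order_id FROM orders
                           WHERE order_date < ADD_MONTHS(SYSDATE, -24));
    DELETE FROM order_items
     WHERE order_id IN (SELECT order_id FROM orders
                         WHERE order_date < ADD_MONTHS(SYSDATE, -24));

    -- Then archive and delete the parent rows.
    INSERT INTO orders_arch
      SELECT * FROM orders WHERE order_date < ADD_MONTHS(SYSDATE, -24);
    DELETE FROM orders WHERE order_date < ADD_MONTHS(SYSDATE, -24);
    COMMIT;
    ```

    Keeping the archive tables structurally identical to the originals makes the eventual reload a plain INSERT ... SELECT in the reverse order (parents first, then children).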

  • I am getting a message that sys startup disk is almost full. I cleaned all my old emails out, archived old ones i needed and erased most files on the computer. I restarted but I'm still getting the message

    I am getting a message that sys startup disk is almost full. I cleaned all my old emails out, archived old ones i needed and erased most files on the computer. I restarted but I'm still getting the message

    Reindex Spotlight again.
    http://support.apple.com/en-us/HT201716
    Do not concern yourself about the backups.  They will be deleted automatically if space is needed for other data:
    http://support.apple.com/en-us/HT202301
    Ciao.

  • Approach for archiving table data from an Oracle DB and loading into an archive server

    Hi,
    I have a requirement where I need to archive and purge old data in a data warehouse.
    The archival strategy will select data from a list of tables and load it into files.
    I also need to use SQL*Loader control files to load data from the above files into the archival server.
    I want to know which is the better approach for loading data into files:
    should I use UTL_FILE or spool table data into files using SELECT statements?
    I also have some CLOB columns in some tables.

    I did something like this a couple of months ago. After some performance tests: the fastest way is to create the files through UTL_FILE used in a procedure with bulk SQL. Another good idea is to create the files with Python, which handles text files blazingly fast. (Use PL/SQL if you need to create the files on the server, but if you have to create files on a remote machine, use something else (Python?).)
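    A hedged sketch of the UTL_FILE plus bulk-fetch approach described above. The directory object, table, and column names are hypothetical, and CLOB columns would need chunked reads via DBMS_LOB rather than plain concatenation:

    ```sql
    CREATE OR REPLACE PROCEDURE dump_old_rows IS
      TYPE t_rows IS TABLE OF old_data%ROWTYPE;
      l_rows t_rows;
      l_file UTL_FILE.FILE_TYPE;
      CURSOR c IS
        SELECT * FROM old_data
         WHERE created < ADD_MONTHS(SYSDATE, -36);
    BEGIN
      l_file := UTL_FILE.FOPEN('ARCH_DIR', 'old_data.csv', 'w', 32767);
      OPEN c;
      LOOP
        -- Bulk fetch in batches to cut SQL/PLSQL context switches.
        FETCH c BULK COLLECT INTO l_rows LIMIT 10000;
        EXIT WHEN l_rows.COUNT = 0;
        FOR i IN 1 .. l_rows.COUNT LOOP
          UTL_FILE.PUT_LINE(l_file, l_rows(i).id || ',' || l_rows(i).payload);
        END LOOP;
      END LOOP;
      CLOSE c;
      UTL_FILE.FCLOSE(l_file);
    END;
    /
    ```

    The output files can then be reloaded on the archive server with SQL*Loader control files, as the question proposes.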

  • Is there anyway to archive old queries and workbooks rather than deleting?

    Our environment has thousands of old queries and workbooks.  We plan to delete any object that has not been used within 13 months, but there's a concern we may someday need a few of the reports that will be deleted.
    Is there a way to archive old queries and workbooks rather than deleting them?
    Is there another approach to solving this matter?
    Thanks!

    Jim 
    This is really an interesting question, as it is a reality that many BW reports are not used or accessed by users, but they still have a tendency to sit on these reports.
    I would suggest carrying out a report-usability analysis. You can do this using BW statistics, with standard or new queries on the technical cubes showing usage data. If you find that a particular BW report has not been used for a long time, you can discuss its usability with the report's user (owning a BW report by a user or user group is normal practice across the industry); if the user no longer uses it or does not need it any more, it is better to delete those reports. This will free up some resources and reduce the junk. (If the user wants these queries, ask him to use them, or it is better to personalize the report on his desktop, making it easy for him to see those reports.)
    Changing the technical names of the queries, as suggested by Bhanu, will have the following effects:
    1. It will add more reports to the repository, leading to junk.
    2. Old queries will still be assigned to user roles, so they will collide with user authorizations.
    Hope the views are clear; others are welcome to add or share their experience in this regard.
    Kindly assign the points if it helps.

  • Problems archiving old messages at the end of the year?

    At the end of 2007 I'd like to create new mailboxes for messages sent and received in the year, date them, and move the messages across into them; leaving the Inbox and Sent box empty and ready to handle messages for 2008. Is this a reasonably straightforward thing to do? Are there potential pitfalls I should be aware of?

    Yes, it’s reasonable and straightforward, and is also what I do, except I don’t wait till the end of the year to archive my messages. Instead, I create a (white) folder for each year, including the current one, and organize my mail by topic into (blue) sub-mailboxes there as I read them. My Inbox only contains messages (if any) that are waiting for me to do something with them.
    Are there potential pitfalls I should be aware of?
    The main pitfall you should be aware of would be not doing that and leaving all your mail in the account mailboxes indefinitely. Using Inbox and Sent for archiving purposes (which a lot of people do) is a bad idea.
    Another thing to be aware of is that you shouldn’t use the same folder as both a mailbox (i.e. to store messages of its own) and as a parent folder for other folders. Mail allows that duality, but doesn’t handle it very well, and it’s better to not try to “take advantage” of it.
    You may also want to read this:
    Blue and white Mail folders explained
    BTW, the title of your post is "Problems archiving old messages at the end of the year?"

  • Solution for purging/archiving old messages from ServiceLink

    Hello there,
    We're looking for some advice on how to purge or archive old messages from ServiceLink.  We're looking to retain one month's worth of data.
    Is it possible to do this?
    Many thanks.

    What version of RC are you running? The maintenance scripts should be included with later releases under the parent installation directory... (\\newScaleXX\schema\util\slcleanup\)
