Best way to split a large partition into multiple smaller partitions

Looking for some ideas here. We ended up with a large/huge last partition (because of MAXVALUE), while the other partitions have similar numbers of records. What's the best way to split this large partition, which has about 3 years' worth of data, into separate partitions by month/year? I tried SPLIT PARTITION, but I was not able to figure out how to specify the value at which to split.
Any thoughts, ideas appreciated.
Thanks

I'd be inclined to split off that partition as a stand-alone table, create new partitions, and then reload the data. But that "inclined" would depend on the available maintenance window, the amount of data, and the Oracle version, none of which you mention.
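
For what it's worth, a minimal sketch of the SPLIT PARTITION syntax for this case, assuming a hypothetical table SALES range-partitioned on SALE_DATE with a catch-all partition PMAX defined as VALUES LESS THAN (MAXVALUE). The value after AT becomes the exclusive upper bound of the new partition, so each statement carves one month off the front of PMAX; repeat it once per month, oldest month first:

    -- Carve January 2012 out of the MAXVALUE partition.
    ALTER TABLE sales
      SPLIT PARTITION pmax
      AT (TO_DATE('2012-02-01', 'YYYY-MM-DD'))
      INTO (PARTITION p2012_01, PARTITION pmax)
      UPDATE GLOBAL INDEXES;

    -- Then February 2012, and so on, one split per month.
    ALTER TABLE sales
      SPLIT PARTITION pmax
      AT (TO_DATE('2012-03-01', 'YYYY-MM-DD'))
      INTO (PARTITION p2012_02, PARTITION pmax)
      UPDATE GLOBAL INDEXES;

With three years of data that is roughly 36 splits, which is easy to generate in a PL/SQL loop. Each split physically moves rows out of PMAX, so on a very large partition the stand-alone table approach suggested above may well be faster within a limited maintenance window.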

Similar Messages

  • What is the best way to split large blocks of text into smaller blocks?

    Hello, I have several pages with 2,000-word articles of text. As you can imagine, these pages are not terribly interesting design-wise, something I plan to change.
    The first thing I need to do is split up these blocks of text into much smaller blocks so that they can easily be moved around and independently manipulated. What's the quickest way to do this in Muse?

    You can create separate text frames and copy the text into them; if you wish, you can group the frames together, which will help in moving them around on the page.
    Thanks,
    Sanjit

  • How to split a large file into multiple small files

    Hi,
       My source file has a simple XML structure, and on the target side I also need to send XML files. But in the source file I receive all the data at once; the source file has more than 1000 records, and I want to process 50 records at a time.
    We could solve this with FCC (RecordSet per Message), but I don't want to do any file conversion; I want to send XML files only.
    How can we handle this?
    Regards
    Jain

    Jain,
    Please see the below screenshots.
    http://www.flickr.com/photos/23855877@N07/2991137391/sizes/o/
    http://www.flickr.com/photos/23855877@N07/2991137441/sizes/o/
    http://www.flickr.com/photos/23855877@N07/2991988028/sizes/o/
    http://www.flickr.com/photos/23855877@N07/2991988084/sizes/o/
    Map No, Name, and City directly.
    In the example, I've split after 5 records. If you want to split after 50 records, then give 50 instead of 5 in the constant.
    raj.

  • What is the Best way to move large mailboxes between datacenters?

    Hi, 
     Are you asking with regard to on-premises Exchange? With Microsoft's online SaaS services (aka Exchange Online), there is no control over, and no need to control, which data center a mailbox resides in.
     With regard to on-premises Exchange, you have two choices. You can move it over the WAN: either do a native mailbox move (assuming you have Exchange 2010 or later, you can suspend the move after the copy so you can control the timing of the cutover), or create a database copy in the second data center and, once the database copies have synchronized, change the active copy.
    The other choice is to move it out of band, which would usually involve an offline seed of the database (you could conceivably move via PST file, but that would disrupt access to the mailbox and is not really the 'best way').
    In general, on-premises Exchange questions are best asked on the Exchange forum: http://social.technet.microsoft.com/Forums/office/en-US/home?category=exchangeserver
    Thanks,
    Guy 

  • What is the best way to create shared variables for multiple PXI (Real-Time) systems and a GUI PC?

    What is the best way to create shared variables between multiple real-time (PXI) systems and a GUI PC? I have 16 PXI systems on the network and one GUI PC. I want to send commands to all the PXI systems using a single variable from the GUI PC (like Start Data Acquisition, Stop Data Acquisition), and I also want data from each PXI system sent to the GUI PC for display. Can anybody suggest the best-performing configuration? Where should I create the variables (on the host PC or on each individual PXI system)?

    Dear Ravens,
    I want to control the real-time application from the host (commands from the GUI PC to the PXIs). The host PC should have access to all 16 PXIs' variables. If communication with a PXI fails, the host will stop displaying data for that particular station.
    Ravens Fan wrote:
    Either. For the best performance, you need to determine what that means. Is it more important for each PXI machine to have access to the shared variable, or for the host PC to have access to all 16 sets of variables? If you have a slowdown or an issue with the network communication, what kinds of problems would it cause for each machine?
    You want to locate the shared variable library on whatever machine is more critical. That is probably each PXI machine, but only you know your application.

  • Best way to split 3D project with constantly moving cameras

    I created a 1-minute project in which the camera travels between 3 sets while there is an animated 2D background. The background consists of overlaid .psd files which move around. The camera sweeps each set and then zooms to the next. Within each set are .mov and .psd files with tons of keyframing (although not too many filters). I think I'm starting to hit the outer limits of my hardware. I have it set to render draft quality for now, and it's still functional. I'm wondering what would be the best way to split it into smaller files, yet retain smooth motion of the camera between sets? I still need to fine-tune the duration of each set, so I'm a little hesitant to break it up at this point, but may need to do so anyway. Thanks in advance.

    Ok, I did it. Project complete. It turned out that I did have enough RAM, just not enough to RAM preview it all. I did flatten quite a bit of it prior to the camera moves. And an additional challenge was when I wanted to add a 5 second clip at the beginning. I wound up adding a second camera because all my timing was thrown off.
    Anyway, here's a link to the final video.
    http://www.youtube.com/watch?v=McBqVzrejNw
    Hey Bogie, I was thinking of starting a separate thread in the FCP or Compressor forums with recommended encodes for getting this monster of a file on the web. I tried several iterations and finally found one that both looked good, was big enough to watch, and small enough to load quickly. Does it make sense to start a new thread for which I already have the answer?

  • What is the best way to deal with large TIFF files in OS X Lion?

    I'm working with a large TIFF file (an engineering drawing), but Preview can't handle it (it becomes unresponsive).
    What is the best way to deal with a large TIFF file in OS X Lion (viewing only, or simple editing)?
    Thx,
    54n9471

    Use an iPad and this app http://itunes.apple.com/WebObjects/MZStore.woa/wa/viewSoftware?id=400600005&mt=8

  • Best way to copy large folders to external drive?

    What is the best way to transfer large folders across to external drives?
    I'm trying to clean up my internal hard drive and want to copy a folder (including its sub folders) of approx 10GB of video files to an external firewire drive.
    I've tried drag and drop in the Finder (both copy and move), and it gets stuck on a particular file and then aborts the whole process.
    1. Surely it should copy everything it can, and simply not copy the problem file.
    2. Is there a way I can do copy and verify, a bit like when I burn a disk so I can be sure the video files have transferred safely before I delete them from my internal drive?
    Many thanks in advance for any advice.

    What you are trying to do makes perfect sense to me and I have done the same prior to getting myself a Time Machine system in place.
    1. Surely it should copy everything it can, and simply not copy the problem file.
    The fact that it is getting stuck on a particular file suggests that there is a problem with it. Try to identify which one it is and deal with that file on its own. It could be that there is a disk error where that file is stored.
    2. Is there a way I can do copy and verify....
    The copy process you are using does that implicitly as I understand it.
    Chris

  • Query performance - A single large document VS multiple small documents

    Hi all,
    What are the performance trade-offs of using a single large document vs. multiple small documents?
    I want to store XML snippets with similar structure in a container. Is there any benefit while querying if I use a single large document to store all these snippets, compared to adding each snippet as a separate document? Would it degrade performance to add an XML snippet each time by modifying an existing document?
    How should we decide whether to use a single large document vs. multiple small documents?
    Thanks,
    Anoop

    Hello Anoop,
    In case you want a comparison between the two storage types for containers, wholedoc and node, let us know.
    > What are the performance trade-offs of using a single large document vs. multiple small documents?
    It depends on what is more important to you: performance when creating the container and inserting the document(s), or performance when retrieving data.
    For querying, the best option is to go with smaller documents, as node indexes help improve query performance.
    For inserting the initial data, you can construct your large document composed of smaller XML snippets and insert the document as a whole.
    If you then want to modify this document, changing its structure implies performance penalties; it is better to store the XML snippets as separate documents.
    Overall, I see no point in using a large document that holds all of your XML snippets, so I strongly recommend going with multiple smaller documents.
    Regards,
    Andrei Costache
    Oracle Support Services

  • What's the best way to save large folders/files on multiple DVDs?

    hi,
    A quick question about the best way to save a large number of folders/files to DVD (4.7 GB discs). I've got about 30-40 GB of material to save, from MP3 sample files through to Logic audio files. I already have them saved to another hard drive, but I just want to be extra safe and have another backup. In the past I've just saved it disc by disc, creating each folder as I go and dragging the appropriate folder across. I was just wondering whether there is an easier way than this, maybe an application where I can drag all the files in and it will sort and save everything for me, much like the way iTunes now saves/backs up files.
    thanks
    darren

    Hi darrenz,
    if you have Toast, you can save your stuff using Toast. There's a great option within it: select the Data tab and choose "Mac only", then just drag all the stuff you want to burn into the Toast window, and it will automatically ask for another DVD until everything is burnt. After that, all your DVDs will have a Toast Restore app on them, which lets you select what you want to restore and tells you which DVDs to insert. Here's a screenshot I found by googling: http://www.macworld.com/images/content/2005/08/19/toast7restore.jpg
    Have fun!
    Björn

  • Best way to set up partitions on 250GB external drive

    Hi all,
    I've decided to go with an Iomega eGo 250-GB FireWire portable drive to back up my iBook's hard disk (30 GB) plus to store my photos and maybe hold a bootable copy of OS9.
    This is my first time using an external drive, so I don't really know how to go about setting things up.
    What would be the best way to set up the partitions on the external drive?
    1) I know I should create a 30GB partition for the iBook backup, but what about for the remaining volume? Is it best to leave it as a single large 200GB partition, or split it into two 100GB chunks? Other than the clone of my iBook, I'll mainly be storing photos (at the moment no more than around 10-12GB).
    2) I don't yet have a huge iTunes collection, but if I later want to store iTunes music that doesn't fit on my internal drive, is it fine to put it in the same partition as the photos, or would it be preferable to create a separate partition?
    3) If I want to install OS9 on the drive, how much space should I allot to it? Would this enable me to run legacy apps on my iBook?
    4) Does it matter how I name the partitions (i.e. should the name for my iBook back-up be the same as the original volume or should it be different?)
    5) As I still don't know exactly what I need: if today I decide to create only a 30GB partition to back up the iBook, can I later partition the remaining space without having to start all over again?
    6) Also, the drive came out of the box with Mac OS Extended format. Is this the best format or is there another that would be better?
    Thanks for any advice

    1) I know I should create a 30GB partition for the iBook backup, but what about for the remaining volume? Is it best to leave it as a single large 200GB partition, or split it into two 100GB chunks? Other than the clone of my iBook, I'll mainly be storing photos (at the moment no more than around 10-12GB).
    2) I don't yet have a huge iTunes collection, but if I later want to store iTunes music that doesn't fit on my internal drive, is it fine to put it in the same partition as the photos, or would it be preferable to create a separate partition?
    The answer to both question is that it's completely up to you. Partitioning or not partitioning won't affect operation significantly.
    3) If I want to install OS9 on the drive, how much space should I allot to it? Would this enable me to run legacy apps on my iBook?
    The iBook G4 cannot boot from OS 9, so you would only be able to run OS 9 in the "Classic" environment.
    4) Does it matter how I name the partitions (i.e. should the name for my iBook back-up be the same as the original volume or should it be different?)
    No
    5) As I still don't know exactly what I need: if today I decide to create only a 30GB partition to back up the iBook, can I later partition the remaining space without having to start all over again?
    There are a few tools which promise the ability to change the partitioning without destroying the data on the drive. But you should never do this without making a backup of the data on the drive. Data loss can easily happen and in a big way.
    6) Also, the drive came out of the box with Mac OS Extended format. Is this the best format or is there another that would be better?
    That is the best format.

  • Best way to remove duplicates based on multiple tables

    Hi,
    I have a mechanism which loads flat files into multiple tables (can be up to 6 different tables) using external tables.
    Whenever a new file arrives, I need to insert the duplicate rows into a side table, but the duplicates are to be searched for across all 6 tables, according to a given set of columns which exists in all of them.
    In the SQL Server version of the same mechanism (which I'm migrating to Oracle), it uses an additional "unique" table with only 2 columns (Checksum1, Checksum2) which hold the checksum values of 2 different sets of columns per inserted record. When a new file arrives, it computes these 2 checksums for every record and looks them up in the unique table, to avoid searching all the different tables.
    We know that working with checksums is not bulletproof, but with those sets of fields it seems to work.
    My questions are:
    Should I use the same checksum mechanism? If so, should I use the owa_opt_lock.checksum function to calculate the checksums?
    Or should I look for duplicates in all the tables one after the other (indexing some of the columns we check for duplicates with)?
    Note:
    These tables are partitioned with day partitions and can be very large.
    Any advice would be welcome.
    Thanks.

    >
    I need to keep duplicate rows in a side table and not load them into table1...table6
    >
    Does that mean that you don't want ANY row if it has a duplicate on your 6 columns?
    Let's say I have six records that have identical values for your 6 columns. One record meets the condition for table1, one for table2 and so on.
    Do you want to keep one of these records and put the other 5 in the side table? If so, which one should be kept?
    Or do you want all 6 records put in the side table?
    You could delete the duplicates from the temp table as the first step. Or better:
    1. add a new column WHICH_TABLE NUMBER to the temp table
    2. update the new column to -1 for records that are dups
    3. update the new column (this might be done with one query) to set the table number based on the conditions for each table
    4. INSERT INTO TABLE1 SELECT * FROM TEMP_TABLE WHERE WHICH_TABLE = 1, and so on, through INSERT INTO TABLE6 SELECT * FROM TEMP_TABLE WHERE WHICH_TABLE = 6
    When you are done, WHICH_TABLE will be flagged with:
    1. NULL if a record was not a dup but was not inserted into any of your tables - a possible error record to examine
    2. -1 if a record was a dup
    3. 1 if the record went to table 1 (2 for table 2, and so on)
    This 'flag and then select' approach performs better than deleting records after each select, especially if the flagging can be done in one pass (full table scan). A sketch of this approach follows at the end of this reply.
    See this other thread (or many, many others on the net) from today for how to find and remove duplicates
    Best way of removing duplicates
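
    Following up on the reply above, a minimal SQL sketch of the 'flag and then select' approach. All names are hypothetical: a staging table TEMP_TABLE and duplicate-key columns COL1..COL3; adapt the conditions in step 3 to your real routing rules.

        -- 1. Add the routing column to the staging table.
        ALTER TABLE temp_table ADD (which_table NUMBER);

        -- 2. Flag rows whose key columns already exist in a target table.
        --    Shown for TABLE1 only; repeat (or OR the EXISTS clauses) for TABLE2..TABLE6.
        UPDATE temp_table t
           SET which_table = -1
         WHERE EXISTS (SELECT 1 FROM table1 x
                        WHERE x.col1 = t.col1
                          AND x.col2 = t.col2
                          AND x.col3 = t.col3);

        -- 3. Route the remaining rows; the CASE conditions are placeholders
        --    for each table's real insert condition.
        UPDATE temp_table
           SET which_table = CASE
                               WHEN record_type = 'A' THEN 1
                               WHEN record_type = 'B' THEN 2
                               -- ...conditions for tables 3 to 6
                             END
         WHERE which_table IS NULL;

        -- 4. One pass per target table, plus the side table for duplicates.
        INSERT INTO table1 (col1, col2, col3)
          SELECT col1, col2, col3 FROM temp_table WHERE which_table = 1;
        INSERT INTO side_table (col1, col2, col3)
          SELECT col1, col2, col3 FROM temp_table WHERE which_table = -1;

    On the checksum question: if you keep that design, the built-in ORA_HASH function (e.g. ORA_HASH(col1 || '|' || col2 || '|' || col3)) is a common alternative to owa_opt_lock.checksum. But as the original poster notes, checksums are not bulletproof, so treat a hash match as a candidate duplicate and verify it against the real columns.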

  • Best way to delete a large number of records without interfering with scheduled tlog backups

    I've inherited a system with multiple databases, and there are DB and tlog backups that run on schedules. There is a list of tables that need a lot of records purged from them. What would be a good approach for deleting the old records?
    I've been digging through old posts, reading best practices, etc., but I'm still not sure of the best way to attack it.
    Approach #1
    A one-time delete that does everything: delete all the old records, in batches of say 50,000 at a time.
    After each run through all the tables for that DB, execute a tlog backup.
    Approach #2
    Create a job that does a similar process as above, except don't loop; only do the batch once. Have the job scheduled to start, say, on the half hour, assuming the tlog backups run every hour.
    Note:
    Some of these tables (well, most) are going to have relations on them.

    Hi shiftbit,
    Based on your description, I have changed this question's type to a discussion, so that more experts will see the issue and can assist you.
    When deleting a large number of records from tables, use bulk (batched) deletions so that the transaction log does not keep growing and run out of disk space. If you can take the table offline for maintenance, a complete reorganization is often best, because it performs the delete and places the table back into a pristine state.
    For more information about deleting a large number of records without affecting the transaction log, see:
    http://www.virtualobjectives.com.au/sqlserver/deleting_records_from_a_large_table.htm
    Hope it can help.
    Regards,
    Sofiya Li
    TechNet Community Support
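
    For what it's worth, a minimal T-SQL sketch of Approach #1's batching; the table dbo.big_table, the column created_date, and the cutoff date are all hypothetical. Small transactions let the scheduled tlog backups reclaim log space between batches:

        -- Delete in batches of 50,000 so each transaction stays small.
        DECLARE @rows INT = 1;
        WHILE @rows > 0
        BEGIN
            DELETE TOP (50000) FROM dbo.big_table
            WHERE created_date < '20130101';   -- hypothetical cutoff date
            SET @rows = @@ROWCOUNT;
            -- Optional pause so log backups and other work can keep up.
            WAITFOR DELAY '00:00:05';
        END

    An index on the date column keeps each batch from scanning the whole table, and in the FULL recovery model log space is only reused after a log backup, which is why aligning batches with the backup schedule (Approach #2) also helps. With foreign-key relations, remember to purge child tables before their parents.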

  • Best way to manage large library with AAC and MP3 versions of files

    I have a large library of music. I listen to music through iTunes and iPod/iPhones, but also other devices which only support MP3, and not AAC. I typically convert iTunes purchases to MP3, but I'm left with two copies of each song in the same folder. I'm looking for recommendations on the best way to manage what would essentially be a duplicated music library - a high quality version I use with my iTunes/ipods, and a lower quality MP3 version for the devices that only support that format.

    I have had a similar problem. I have all my music residing on a NAS where I access it from iTunes (for managing my iPods/iPhones) and a Tivo PVR (which is connected to my house stereo system.) The problem is that Tivo does not recognize AAC. So I've used iTunes to create mp3 versions of all my purchased AAC music. Because the NAS is set as my iTunes media folder location, iTunes puts the mp3 copies back into the existing folder structure (which it keeps organised) on the NAS. Tivo accesses the NAS directly, over the network. But iTunes also puts the additional copy into the iTunes library which leaves me with multiple copies of the same songs on iTunes AND my iPods. Having multiple copies on the NAS (mp3 for Tivo and AAC for iPod) is OK: I've got plenty of space on the NAS and the Tivo doesn't 'see' the AAC files.
    The solution I'm planning to implement is to delete the MP3 entries from the library as soon as I create them (leaving the files in the media folder). But what I'm looking for is a way to automatically select the MP3 copies (for deletion) from the existing duplicates list. Any ideas?

  • Best way to handle large number video files for a project..

    Hey, I was looking to get some insight from the community here. Basically, there is a project being worked on that requires a large amount of footage to be sifted through, of which only a small percentage will be used. These are mostly HD files, and while most of the footage has been watched in QuickTime with notes taken, my question is this:
    What is the best way to take only small portions of each file, without having to load everything into Final Cut and without any loss of quality? Should I just trim and rename in QuickTime, or is there an easier way?
    The reason this needs to be done this way is that the smaller segments will each be sent to other editors, and rather than sending huge files we want to split the footage into smaller amounts for each editor to use.
    Thank you so much for any input regarding this; I look forward to what you have to say.

    Open the clip into the Viewer. Mark In and Out points on the section you want. Make it a subclip (Cmd-U). Drag the subclip into the bin for the editor who needs it. Repeat.
    If you batch export from a clip, there is a selection to choose whether to export the whole clip, or a checkbox to export the marked In/Out.
    This does not sound like a good project on which to begin learning FCP.
