Best Way to Drop Large Clob Column?

I have a very large partitioned table that contains XML documents stored in a clob column. Aside from the clob column there are several varchar and numeric columns in the table that are related to each document. We have decided to move the XML out of Oracle and into text files on the OS but want to keep the other data in Oracle. Each partition has a tablespace for the clob column and a tablespace for the other columns.
What is the best (quickest/most efficient) way to drop the clob column and free up the space that it is currently using?
OS: HP-UX
Oracle: 11.2.0.3
Table Partitions: 27
Table Rows: 550,000,000
Table Size: around 15 TB with 95% of that found in the column to drop
One other wrinkle, there are several tables that have a foreign key relationship back to the primary key of the table in question. Three of those tables are multi-billion rows in size.

Hi,
You can mark the column unused and then drop it using the CHECKPOINT option of the DROP COLUMN statement. This AskTom thread covers the technique; it may help you:
http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:623063677753
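A minimal sketch of what that two-step approach looks like, using hypothetical table and column names (your table and CLOB column will differ; test on a copy first, since the physical drop on a 550M-row table can run for a long time):

```sql
-- Step 1: mark the column unused. This is a fast, metadata-only
-- change, so the application can stop seeing the column immediately.
ALTER TABLE xml_docs SET UNUSED (doc_clob);

-- Step 2: physically remove the column and reclaim space.
-- CHECKPOINT n commits every n rows, keeping undo usage bounded
-- on a very large table (at the cost of the operation not being
-- atomically restartable in the usual way if it is interrupted).
ALTER TABLE xml_docs DROP UNUSED COLUMNS CHECKPOINT 10000;
```

Because the CLOB segments live in their own tablespaces per partition, the space freed by the drop should then be reclaimable at the tablespace level once the LOB segments are gone.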

Similar Messages

  • Best way to copy large folders to external drive?

    What is the best way to transfer large folders across to external drives?
    I'm trying to clean up my internal hard drive and want to copy a folder (including its sub folders) of approx 10GB of video files to an external firewire drive.
    I've tried drag and drop in finder (both copy and move) and it gets stuck on a particular file and then aborts the whole process.
    1. Surely it should copy everything it can, and simply not copy the problem file.
    2. Is there a way I can do copy and verify, a bit like when I burn a disk so I can be sure the video files have transferred safely before I delete them from my internal drive?
    Many thanks in advance for any advice.

    What you are trying to do makes perfect sense to me and I have done the same prior to getting myself a Time Machine system in place.
    1. Surely it should copy everything it can, and simply not copy the problem file.
    The fact that it is getting stuck on a particular file suggests that there is a problem with it. Try to identify which one it is and deal with that file on its own. It could be that there is a disk error where that file is stored.
    2. Is there a way I can do copy and verify....
    The copy process you are using does that implicitly as I understand it.
    Chris

  • What is the Best way to move large mailboxes between datacenters?

    What is the Best way to move large mailboxes between datacenters?

    Hi, 
     Are you asking with regard to on-premises Exchange? With Microsoft's online SaaS service (aka Exchange Online) there is no control over, and no need to control, which data center a mailbox resides in.
     With regard to on-premises Exchange, you have two choices. You can move it over the WAN, in which case you would either do a native mailbox move (assuming you have Exchange 2010 or later, you can suspend the move after the copy so you can control the time of the cutover) or create a database copy in the second data center and, once the database copies have synchronized, change the active copy.
    The other choice is to move it out of band, which would usually involve an offline seed of the database (you could conceivably move via PST file, but that would disrupt access to the mailbox and is not really the 'best way').
    In general, Exchange on-premises questions are best asked on the Exchange forum: http://social.technet.microsoft.com/Forums/office/en-US/home?category=exchangeserver
    Thanks,
    Guy 

  • What is best way dealing with large tiff file in OSX Lion?

    I'm working with a large TIFF file (an engineering drawing), but Preview can't handle it and becomes unresponsive.
    What is the best way to deal with large TIFF files in OS X Lion? (Viewing only, or simple editing.)
    Thx,
    54n9471

    Use an iPad and this app http://itunes.apple.com/WebObjects/MZStore.woa/wa/viewSoftware?id=400600005&mt=8

  • What is the best way to drop and recreate a Primary Key in the Replication Table?

    I have a requirement to drop and recreate a primary key in a table which is part of transactional replication. What is the best way to do it other than removing it from replication and adding it again?
    Thanks
    Swapna

    Hi Swapna,
    Unfortunately you cannot drop columns used in a primary key from articles in transactional replication.  This is covered in
    Make Schema Changes on Publication Databases:
    You cannot drop columns used in a primary key from articles in transactional publications, because they are used by replication.
    You will need to drop the article from the publication, drop and recreate the primary key, and add the article back into the publication.
    To avoid having to send a snapshot down to the subscriber(s), you could specify the option 'replication support only' for the subscription.  This would require the primary key be modified at the subscriber as well prior to adding the article back in
    and should be done during a maintenance window when no activity is occurring on the published tables.
    I suggest testing this out in your test environment first, prior to deploying to production.
    Brandon Williams (blog |
    linkedin)
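    A rough sketch of the drop/recreate/re-add sequence Brandon describes, with hypothetical publication, table, and key names (the exact sp_addarticle parameters you need will depend on how the article was originally scripted):

    ```sql
    -- 1) Drop the article from the publication.
    EXEC sp_droparticle
        @publication = N'MyPublication',
        @article     = N'MyTable',
        @force_invalidate_snapshot = 1;

    -- 2) Drop and recreate the primary key. With 'replication support
    --    only' subscriptions, do the same at each subscriber too.
    ALTER TABLE dbo.MyTable DROP CONSTRAINT PK_MyTable;
    ALTER TABLE dbo.MyTable ADD CONSTRAINT PK_MyTable PRIMARY KEY (KeyCol);

    -- 3) Add the article back into the publication.
    EXEC sp_addarticle
        @publication   = N'MyPublication',
        @article       = N'MyTable',
        @source_object = N'MyTable';
    ```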

  • Best Way to Copy Large Files

    In copying files to a USB memory stick, is there a particular file size limit above which simple drag and drop using the Finder does not work consistently? In general can we assume that if it appears to copy OK, then the files are in good shape? For example, if I'm copying a Documents folder with 5 GB of data onto an 8 GB USB stick would it be appropriate to use drag and drop? What are the alternative ways to copy large blocks of data? Is there a way to use Disk Utility for this?
    I should mention one of the reasons I'm asking. I've occasionally noticed large copying procedures sometimes cause the Finder to "hang."
    I do have SuperDuper which I use to clone my internal HD to a peripheral HD on a regular basis. Would that software provide a way to copy single files or folders containing lots of files to my USB memory stick?
    Thanks,
    Steve M.
    Message was edited by: Steve M.

    Drag and drop is fine. There's really no good reason to do it a different way.
    The only file limit is if the USB stick is formatted FAT32. Then 4GB minus 1 byte is the largest file size it can hold. If you format the stick as Mac OS Extended (HFS+), there's no limit to the file size, just the capacity of the drive itself.
    If you need to share between Mac and Windows, then formatting the stick as ExFAT is best. The only catch is that versions of OS X earlier than Snow Leopard will not be able to read the drive. Windows XP, Vista and Win 7 can read ExFAT. There is also essentially no file size limit with ExFAT.
    As far as the hanging, that may be the OS stalling on a file larger than 4 GB if the USB stick is formatted as FAT32.

  • What is the best way to send large, HD video files from the iPad?

    Hello,
    My department is looking to purchase between 50 and 150 new iPads. We hold conferences where participants take short, 3-minute videos of themselves in situations they have been trained for. The idea was that after they have taken the video, they could simply email these videos to themselves, negating the need to hook the device up to a computer, transfer the video over to a flash drive, and ensure that drive gets back to the right person.
    Now that I have the first one, I'm finding any video longer than 45s or so can't be sent via email because it's too large.
    Are you kidding?
    The file I was just able to send was 7.5MB. That's the biggest video I can send from this thing? Why even give us the ability to take HD video if we can't *do* anything with it?
    But, I digress.
    What I'm looking for is the best way to get video off the iPads and into the hands of our participants. Are there apps on the App Store which might convert whatever video we take to an emailable size? I thought about setting everyone up with Apple IDs and having them use iMessage since there is no limit on file size, but of course you must own an Apple device in order to do that, and it still doesn't solve getting it on their computers.
    Any help here would be greatly appreciated.
    Thanks!
    --Rob

    You could try uploading them to a shared Dropbox account that they could then download them from, but how fast it is will depend on the internet connection available.

  • Best way to delete large number of records but not interfere with tlog backups on a schedule

    I've inherited a system with multiple databases, and there are DB and tlog backups that run on schedules. There is a list of tables that need a lot of records purged from them. What would be a good approach for deleting the old records?
    I've been digging through old posts, reading best practices, etc., but I'm still not sure of the best way to attack it.
    Approach #1
    A one-time delete that did everything.  Delete all the old records, in batches of say 50,000 at a time.
    After each run through all the tables for that DB, execute a tlog backup.
    Approach #2
    Create a job that does a similar process to the above, except don't loop; only do the batch once. Have the job scheduled to start, say, on the half hour, assuming the tlog backups run every hour.
    Note:
    Some of these (well, most) are going to have relations on them.

    Hi shiftbit,
    When deleting a large number of records from tables, use batched deletions so that the transaction log does not grow until it runs out of disk space. If you can take the table offline for maintenance, a complete reorganization is always best because it does the delete and places the table back into a pristine state.
    For more information about deleting a large number of records without affecting the transaction log, see:
    http://www.virtualobjectives.com.au/sqlserver/deleting_records_from_a_large_table.htm
    Hope it can help.
    Regards,
    Sofiya Li
    TechNet Community Support
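    A minimal sketch of Approach #1's batched-delete loop, using a hypothetical table name and retention cutoff (adjust the batch size and WHERE clause to your schema; delete from child tables first where foreign keys exist):

    ```sql
    -- Delete in batches of 50,000 so each transaction stays small and
    -- the scheduled tlog backups can truncate the log between batches.
    DECLARE @rows INT = 1;
    WHILE @rows > 0
    BEGIN
        DELETE TOP (50000) FROM dbo.AuditLog
        WHERE CreatedDate < DATEADD(YEAR, -2, GETDATE());
        SET @rows = @@ROWCOUNT;
    END;
    ```

    Adding a short WAITFOR DELAY between batches is a common variant that gives log backups and other workload room to run.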

  • Best way to move large iTunes library?

    I have a large (nearly 1 TB) iTunes library of music, videos and TV shows and have almost reached the capacity of the HDD they currently reside upon.  At the moment they are on a Western Digital My Book, attached by USB to my Time Capsule.  I have just ordered a larger LaCie drive to put them on and plan to attach this to my iMac directly, by FireWire (I don't have a Thunderbolt iMac).  The Time Capsule is attached to the iMac by Ethernet cable.
    I know the best way to move an iTunes library is to point the library at the new location and then 'Organise Library' from the File menu.  I suspect this will take a while, with so much data to transfer.  I am wondering if I would be better to attach both drives to my computer directly and transfer the files across, then point the iTunes library at the new location - would this work?  It is vital that whatever I do keeps all the metadata intact, as I spent ages tagging all my DVDs.  I don't care about play counts, and my playlists can easily be remade, so I don't mind losing them.
    Any advice is gratefully received,
    Steve

    I have a 270 GB iTunes folder and a 260 GB iPhoto library and used this technique to massively speed up the transfer to an external USB 3 drive.  It uses Terminal commands I found here:
    http://www.cnet.com/au/news/using-the-os-x-terminal-instead-of-the-finder-to-copy-files/
    I used this command successfully to achieve a 3 GB per minute transfer.
    First open the Terminal app (easy to find in Spotlight).
    Then type the following, dragging the source and destination folders from the Finder onto the Terminal window to fill in their paths: cp -av /path/to/source/* /path/to/destination/
    Your transfer should begin
    NOTE: If you are using this for your iPhoto or Aperture libraries there are many index files containing meta data of your photos.  This may appear to not transfer large amounts of data while transferring these files before getting to the .jpg or .RAW files.  Don't worry, be patient, once these are done the speed will pick up considerably.
    I hope this helps,
    Matthew

  • Best way to manage large library with AAC and MP3 versions of files

    I have a large library of music. I listen to music through iTunes and iPod/iPhones, but also other devices which only support MP3, and not AAC. I typically convert iTunes purchases to MP3, but I'm left with two copies of each song in the same folder. I'm looking for recommendations on the best way to manage what would essentially be a duplicated music library - a high quality version I use with my iTunes/ipods, and a lower quality MP3 version for the devices that only support that format.

    I have had a similar problem. I have all my music residing on a NAS where I access it from iTunes (for managing my iPods/iPhones) and a Tivo PVR (which is connected to my house stereo system.) The problem is that Tivo does not recognize AAC. So I've used iTunes to create mp3 versions of all my purchased AAC music. Because the NAS is set as my iTunes media folder location, iTunes puts the mp3 copies back into the existing folder structure (which it keeps organised) on the NAS. Tivo accesses the NAS directly, over the network. But iTunes also puts the additional copy into the iTunes library which leaves me with multiple copies of the same songs on iTunes AND my iPods. Having multiple copies on the NAS (mp3 for Tivo and AAC for iPod) is OK: I've got plenty of space on the NAS and the Tivo doesn't 'see' the AAC files.
    The solution I'm planning to implement is to delete the mp3 entries from the library as soon as I create them (leaving the files in the media folder). But what I'm looking for is a way to automatically select the mp3 copies (for deletion) from the existing duplicates list. Any ideas?

  • Best way to handle large list of results in recordsets?

    Hello all.
    I'm using Dreamweaver CS3, MySQL and ASP/VBScript.
    My database of users behind my website is now approaching 25,000.
    I often have to "move" items in the database from one user record to another.
    Up and until now, I've done this simply by way of a drop down menu/list that is populated with the user ID# and Name of each and every user in the database.   This previously allowed me to simply select the ID of the Customer I wanted to "Move" the record to.
    The problem is that the system is of course now trying to load a list of almost 25,000 user IDs each time I view the relevant site page, which is now taking so long to load that it's uncomfortable.
    I've seen other sites that allow you to start typing something in to a text box and it starts filtering the results that match as you type, showing a list below.
    I assume (but am happy to be advised otherwise) that this is likely to be my best way forward, but I haven't the first clue how to do it.
    Can anyone advise?
    Regards
    David.

    You're looking for a 'type ahead' control. Try searching the web, although you may have trouble finding example code for classic asp. I did find some asp.net solutions out there.

  • Best way to handle large number video files for a project..

    Hey, I was looking to get some insight from the community here. Basically, there is a project being worked on that requires a large amount of footage to be sifted through, of which only a small percentage will be used. These are mostly HD files, and while most of the footage has been watched in QuickTime with notes taken, my question is this.
    What is the best way to take only small portions of each file without having to load everything into Final Cut and without any loss of quality? Should I just trim and rename from QuickTime, or is there an easier way?
    The reason this needs to be done this way is that the smaller segments will each be sent to other editors, and rather than sending huge files we want to split them into smaller amounts for each editor to use.
    Thank you so much for any input regarding this, I look forward to what you have to say

    Open the clip into the viewer. Mark In and Out points on the section you want. Make it a subclip Cmd-U. Drag the subclip into the bin for the editor who needs it. Repeat.
    If you batch export from a clip there is a selection to choose whether to export the whole clip or check box to export the marked I/O.
    This does not sound like a good project on which to begin learning FCP.

  • Best way to handle large files in FCE HD and iDVD.

    Hi everyone,
    I have just finished working on a holiday movie that my octogenarian parents took. They presented me with about 100 minutes of raw footage that I have managed to edit down to 64 minutes. They have viewed the final version that I recorded back to tape for them. They now want to know if I can put it onto a DVD for them as well. Problem is, the FCE HD file is 13 GB.
    So here is my question.
    What is the best way to handle this problem?
    I have spoken to a friend of mine who is a professional editor. She said reduce the movie duration down to about 15mins because it's probably too long and boring. (rather hurtful really) Anyway that is out of the question as far as my oldies are concerned.
    I have seen info on Toast 8 that mentions a "Fit to DVD" process that purports to "squash" 9Gb of movie to a 4.7Gb disk. I can't find if it will also put 13Gb onto a dual layer 8.5Gb disk.
    Do I have to split the movie into two parts and make two dual-layer DVDs? If so, I have to ask - how come "Titanic", at 3hrs+, fits on one disc?
    Have I asked too many questions?

    Take a deep breath. Relax. All is fine.
    iDVD does not look at the size of your video file, it looks at the length. iDVD can accommodate up to 2 hours of movie.
    iDVD gives you different options depending on the length of your movie. Although I won't agree with your friend about reducing the length of your movie to 15 minutes, if you could trim out a few minutes to get it under an hour, that setting in iDVD (Best Performance, though the new version may have renamed it) gives you the best quality. Still, any iDVD setting will give you good quality, even at 64 minutes.
    In FCE, export as QuickTime Movie, NOT any flavour of QuickTime Conversion. Select chapter markers if you have them. If everything is on one system, uncheck the Make Movie Self-Contained button. Drop the QT file into iDVD.

  • Whats the best way to save large folders/files on multiple dvds?

    hi,
    A quick question about the best way to save a large number of folders/files to DVD (4.7 GB discs). I've got about 30-40 GB of stuff to save, from mp3 sample files through to Logic audio files. I already have them saved to another HD, but I just want to be extra safe and have another backup. In the past I've just saved it disc by disc, creating each folder as I go and dragging the appropriate folder across. I was just wondering whether there was an easier way than this. Maybe there's an application where I can drag all the files in and it will sort and save everything for me, much like the way iTunes now saves/backs up files.
    thanks
    darren

    Hi darrenz,
    if you have Toast you can save your stuff using Toast. There's a great option within Toast. You need to select the Data tab and choose "Mac only". Then just drag all the stuff you want to burn to the Toast window and it will automatically ask for another DVD until everything is burnt. After that all your DVDs will have a Toast Restore app on them. It lets you select what you want to restore and will tell you which DVDs to insert. Here's a screenshot I found by googling: http://www.macworld.com/images/content/2005/08/19/toast7restore.jpg
    Have fun!
    Björn

  • Best way to split large partition in multiple small partitions

    Looking for some ideas here. We ended up having a large/huge partition, the last one (because of MAXVALUE), while the other partitions have similar numbers of records. What's the best way to split this large partition, which has about 3 years' worth of data, into separate partitions based on month/year? I tried SPLIT PARTITION, but I'm not able to figure out a way to specify the value at which I'd like to split.
    Any thoughts, ideas appreciated.
    Thanks

    I'd be inclined to split off that partition as a stand-alone table, create new partitions, and then reload the data. But that "inclined" would depend on the available maintenance window, the amount of data, and the Oracle version number, none of which you mention.
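    For the direct SPLIT PARTITION route, a sketch using hypothetical table, partition, and date-column names (for a range-partitioned table, AT takes the upper boundary of the new lower partition):

    ```sql
    -- Split the MAXVALUE partition at a month boundary: rows before
    -- 2014-02-01 go into p_2014_01, the rest stay in p_max.
    ALTER TABLE sales SPLIT PARTITION p_max
        AT (TO_DATE('2014-02-01', 'YYYY-MM-DD'))
        INTO (PARTITION p_2014_01, PARTITION p_max);
    -- Repeat with each subsequent month's boundary until p_max
    -- holds only the current data.
    ```

    Each split of a large partition rebuilds data and can invalidate local indexes, which is why the split-off-and-reload approach above may be faster inside a tight maintenance window.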
