Find Big Files?

Hey everyone!
Quick question for you. As the primary use of my Apple iMac G5 is making movies, my iMac fills up its 160GB hard drive quite quickly. I went through and deleted a bunch of Final Cut files... bringing my free space from 20GB to 85GB... but I don't know what other big files are on here! Is there any way I can look in Finder or something at the biggest files on my system? I just want everything that's over a couple of gigs to come up. Any suggestions?
Thanks a lot.
-Dean

This little app, Where is my disk space?, will index your entire hard drive by file size. The demo version will give you the basic information if all you need to do is locate things and then manually delete them.
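If you'd rather not install anything, Terminal can do a rough version of the same search. A minimal sketch, assuming a 2 GB cutoff and a scan of your home folder (adjust both to taste):

```shell
# List files larger than 2 GB under the home folder, with human-readable sizes.
# The "+2G" size suffix is supported by the find shipped with Mac OS X 10.5+
# (and by GNU find); on older systems use the byte count, e.g. -size +4194304000c.
find ~ -type f -size +2G -exec ls -lh {} \; 2>/dev/null
```

Swap ~ for / to sweep the whole drive; expect it to take a while on a 160GB disk.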

Similar Messages

  • Navigating the Macintosh HD - How to find big files?

    Hi all,
    My Mac mini has swallowed a 7gb file by accident... it was a DVD image I planned to watch and delete, but I forgot the location of the file on the HD.
    The naming convention will make it difficult for me to search by name.
    Is there a way to search the drive by file size??
    Steve

Indeed, it's easy to find that way. If, on the other hand, you're looking around in general rather than for one specific file, a useful utility to have around is WhatSize, which can track which files and folders are taking up how much space. That can be helpful when trying to work out why available drive space has decreased! WhatSize can be downloaded from www.versiontracker.com

  • I just got a new Mac Mini, and now when I try to import photos into Lightroom, iPhoto tries to load them first, then Lightroom can't find the files.  Big problem.  What to do?

    I just got a new Mac Mini, and now when I try to import photos into Lightroom, iPhoto tries to load them first, then Lightroom can't find the files.  Big problem.  What to do?

    Simple:
    File -> Export to get your photos out of the iPhoto Library. In the Export dialogue you can set the kind to Original and you'll get back exactly what you put in.
    Apps like iPhoto2Disk or PhotoShare will help you export to a Folder tree matching your Events.
    Once exported, you can then trash the iPhoto Library - just drag it from the Pictures Folder to the trash and empty it.
    After that, if you're entirely neurotic about it, just put the iPhoto app in the trash and empty it.
    Regards
    TD

  • Can't find imovie file on my hard drive

    I have an iMovie file on my hard drive that I can't find. I really want to delete it so
    that I can free up some room on my hard drive. It's at least 15 gigs big. I tried using the search icon in the upper right-hand corner and I tried using the 'Find' option under the 'Finder' menu, searching for any files over 5 gigs in size. Can anyone help me find this file so I can get rid of it?
    [email protected]

    An iMovie project is a package, not a file; the actual files inside it are smaller. Use a tool such as WhatSize to locate it.

  • My hard disk suddenly filled up and I can't find the files

    that are causing my hard disk to fill up. I was watching some videos that were on an external drive, using VLC. I finished one movie (.avi file) and then tried to watch the second one in the series. However, VLC would not open it. I tried several times. I gave up and tried opening it with QuickTime Player. Even though it opened, I got a message that my startup disk was full. That was strange because I am sure I had about 20GB, as I was keeping a close eye on my hard disk capacity. I had recently deleted many video files and so had a rough idea of how much hard disk space I should have.
    Anyway, I deleted some movie files to make space and then I checked my hard disk with Disk Inventory X.
    When I did that, I found some strange files.
    There are some temp files some of which are 1GB. The path is private/var/vm/tmp. One file is labeled a Quick Time player launcher document.
    Another one is called "sleepimage". Format is HFS+. It is 4GB. It was modified recently (yesterday). Its path is private/var/vm/sleepimage.
    I don't want to delete anything important by accident.
    I just think this is strange and want to find these files but when I click "Open", it says I don't have permission and I have to locate them in Finder. However, when I type "sleepimage" in Finder, nothing comes up.
    I really want to get rid of any files that take up a lot of space on my computer and that shouldn't be there. Do you think this is some kind of virus? And that's why my computer is filling up with some strange files. There are also many swapfile documents that are 1GB in size.

    I think the problem is Quick Time Player. I am using OS 10.6.8 by the way.
    I tried it out again.
    I had about 26 GB of space before I played an .avi video.
    After the movie started to play (it was slow to load), I got a warning message that my startup disk was full. I opened Disk Utility and it told me I had no space left (about 20 MB or something like that was left).
    I watched to the end of the movie.
    Then I quit Quicktime Player.
    And I shut down the computer.
    I turned on the computer and checked Disk Utility. I have about 26 GB of space.
    I think for some reason Quicktime Player uses up a lot of the disk space when it runs movies that are problematic. This movie I was watching did not open with vlc.
    I normally use vlc to watch movies.
    Probably it doesn't use up this much space when the movie is OK.
    I think there's a problem with this particular movie. It is about 50 MB so it's not that big a file. I watch .smi subtitles with it. The Quicktime Player automatically loads the subtitle file in this case because the subtitle file has the same name as the movie file.
    I watched about 10 movie files from the same series that I downloaded together and did not have any problem watching them. I watched them on vlc. But this latest movie had problems I think (as I repeat). It did not load on vlc even though I waited for a minute, and normally movies load instantly on vlc. 
    So I think that movie file caused problems. It made QT player use up a lot of hard disk space.
    I had better avoid watching that movie in the future. I don't want it to wreck my hard drive.
    So to sum up, everything is back to normal. I have the hard disk space that I should have. However, I am scared of damaging my hard drive due to shonky movie files that eat up all the space when they are played using QT player, and so I will not watch those files in the future, as I do not know why the files do that.
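    For what it's worth, sleepimage and the swapfiles live in /private/var/vm, which Spotlight doesn't index - that's why a Finder search for "sleepimage" comes up empty. They aren't a virus: sleepimage is the saved copy of RAM used for safe sleep (which is why its size matches installed RAM), and the swapfiles are the system's virtual memory. A quick way to look at them from Terminal (a sketch; the directory only exists on Mac OS X):

```shell
# Show the virtual-memory files (sleepimage, swapfile0, ...) and their sizes.
if [ -d /private/var/vm ]; then
    ls -lh /private/var/vm
else
    echo "no /private/var/vm on this system"
fi
```

    The system recreates these files as needed, so deleting them by hand gains little.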

  • How do i find a file in finder exported to youtube using motion 5?

    I just tried to upload a video from Motion 5 to YouTube. It transcoded/exported the file but failed to upload to YouTube, and I can't find the file. How do I find the file so I can upload it manually?
    Side note, a second question: this is the only way I could export the video at a normal size (650MB); every other regular "export movie" selection ended up being 10GB. What settings do I use to make sure it's a normal size? The video really isn't that high quality and I have no idea why it would be that big.
    Any help is appreciated.

    What do you mean by "normal" size?  What is the frame size, frame rate, and codec you're editing in the Timeline?

  • Can't move big files also with Mac OS Extended Journaled!

    Hi all =)
    This time I can't really understand.
    I'm trying to move a big file (an app, 9 GB large) from my MacBook's Applications folder to an external USB drive formatted HFS+, but at about 5.5 GB the process stops and I get this error:
    "The Finder can't complete the operation because some data in "application.app" can't be read or written. (error code -36)"
    Tried to search for this error code over the internet with no results.
    Tried the transfer of the same file to a different drive (which is also HFS) but I still get the same error code.
    Both drives have plenty of available free space.
    Also tried different USB cables.
    The app in question has just been fully downloaded from the App Store with no errors.
    What should I try now? Any suggestion welcome... this situation is so frustrating!
    Thanks

    LowLuster wrote:
    The Applications folder is a System folder and there are restrictions on it. I'm not sure why you are trying to copy an app out of it to an external drive. Try copying it to a folder on the ROOT of the internal drive first - a folder you create using an Admin account - and then copy that to the external.
    Thanks a lot LowLuster, you actually solved my situation!
    But apparently in this "forum" you can't unselect the chosen answer if you clicked the wrong one by mistake, you can't edit your messages, and you can't even delete your messages... such freedom down here, jeez!

  • "system cannot find the file specified" when launching a java app

    Hi,
    we are receiving an error message saying "The system cannot find the file specified" when launching a java app on a server. Exactly the same configuration on another server does not give the same problem.
    The dialog box with the error message only presents one button - OK - and after pressing the button the app continues to run normally. This however is a big problem for us as we are trying to convert the apps into system services using the wrapper (http://wrapper.tanukisoftware.org).
    Note: the problem exhibits itself with and without the use of the wrapper.
    I am not very comfortable about reinstalling the JRE on a production server. Is there any way to see what file it is missing? Debug options or something similar?
    The JRE in question is 1.4.2_06. Many thanks in advance for your answers.

    If not setting the Post-Processing option results in a cryptic error, why doesn't Adobe do at least one of three things?
    1. Don't let user click Export on Export dialog if nothing's selected for Post-Processing.
    2. Default Post-Processing to Do Nothing
    3. Give the user a hint as to what the problem is when the error occurs instead of issuing cryptic error message.
    Allowing this error to persist for so long is ridiculous.

  • IBook need a diet. How to find biggest files/folders?

    Hi -- this may seem like a silly question, but is there a way to search for the largest files/folders on my iBook (latest version of Tiger)? I'd like to find the offending bloat and get rid of it.
    A couple of weeks ago, I had 20GB left, now I've got 14GB, and I don't know what I've done to get there. I've done the "get info" thing on various folders, but haven't found any that are particularly bloated.
    Is there an app (preferably freeware, natch.) that can search for apps/files/folders by size? Or is this a capability that's already in Tiger?
    Thanks!

    You can also use Terminal to find big directories.
    The following command finds the top 10 directories:
    du -S | sort -nr | head
    Please note that I am using the fink (GNU) version of du; the -S option (don't include subdirectory sizes) isn't in the BSD du that ships with Mac OS X.
    Dual G5 @ 2GHz   Mac OS X (10.4.5)  
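    If you don't have fink installed, the stock BSD du can do something similar; a sketch (sizes are in KB because of -k):

```shell
# Ten largest directories one level below the current folder, using stock du.
du -k -d 1 . | sort -nr | head -n 10
```

    Run it from your home folder first, then cd into whichever directory tops the list and repeat.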

  • Finding largest files

    Hi Guys,
    I have a situation where I have multiple home drives on a share. I want to list their largest files (by size) together with their parent folder names (usually the username), so I know which user is consuming the most space. It would also be good to point
    out their largest files so they can delete them or make other arrangements. It would be nice to get a spreadsheet with the results.
    I know it can be done for each of those home drives individually, which is a time-consuming task, and I know PowerShell can do it quicker, so I hope someone has an answer!
    -mEtho

    Thanks very much for the replies. The following is the script, which scans all the "Home Drives" and produces a nice list with the name/location and size. With that I can find out who is using most of the space.
    function Get-FolderLength {
        # Usage: Get-FolderLength -Path D:\Homedirs
        param ( [Parameter(Mandatory=$true)] [String]$Path )
        $FileSystemObject = New-Object -com Scripting.FileSystemObject
        $folders = Get-ChildItem $Path | Where-Object { $_.PSIsContainer }
        foreach ($folder in $folders) {
            $folder | Add-Member -MemberType NoteProperty -Name "SizeMB" -Value (($FileSystemObject.GetFolder($folder.FullName).Size) / 1MB)
        }
        $folders | Sort-Object -Property SizeMB -Descending | Select-Object FullName, @{n='Size MB'; e={"{0:N2}" -f $_.SizeMB}}
    }
    That script only scans the directories. Next I would like to scan each of those home drives and list only the top 10 or 20 biggest files, so I can inform users about those big files.
    Example: 
    Directory 1 
    Directory 2
    Directory 3
    Directory 4
    I want to scan each of those directories to find which files are using the most space. A nice output with directory names and the most space-consuming files would be great (CSV).
    Any help would be much appreciated!

  • Finding unnecessary files?

    Hey guys, I just wanted to know how I can find unnecessary files, particularly large files, on my Mac HDD. I don't trust 3rd party apps so I don't want to use one. I just want to do it myself using something within OS X Mavericks.
    Thanks!

    OmniDiskSweeper is worth the download (and it is free).
    NOTE:  Stay in your home directory tree. If you start deleting system files, you may end up turning your system into a door stop. If you think there is a file you should be able to delete that is not in your home directory tree, research it with Google, or ask in these forums.
    If you insist on using only Mac OS X features, you could enable Finder -> View -> Show View Options -> Calculate all sizes.
    This will show the amount of data stored in each directory in "List" view; you can open a directory and see the sizes of each file and directory inside, then open the next level down, and so on.
    You chase the larger directories, looking for large files you own and know you can safely delete.
    NOTE:  "Calculate all sizes" can be very slow and can slow down your Mac if you leave it on all the time, so turn it off when you are done looking for big files.

  • Finding large files

    Hi,
    Over the last couple of weeks I seem to have lost 30 gigs of space on my home drive. I'm now down to 6 GB free and it's starting to hang. (Snow Leopard with a 300 GB drive.)
    Is there a utility for visualizing which folders contain the largest files, or for finding big new files? I don't know where the space has gone. I've looked in a few obvious folders, but haven't found anything obvious to delete or clean up...
    Any ideas or utilities to recommend?
    Eric

    Try OmniDiskSweeper or WhatSize (from VersionTracker or MacUpdate).

  • How to parse a big file with Regex/Pattern

    I would like to parse a big file using Matcher/Pattern, so I thought of using a BufferedReader.
    The problem is that a BufferedReader constrains me to read the file line by line, while my patterns occur not only inside a line but also across the end of one line and the beginning of the next.
    For example this class:
    import java.util.regex.*;
    import java.io.*;
    public class Reg2 {
      public static void main (String[] args) throws IOException {
        File in = new File(args[1]);
        BufferedReader get = new BufferedReader(new FileReader(in));
        Pattern hunter = Pattern.compile(args[0]);
        String line;
        int lines = 0;
        int matches = 0;
        System.out.print("Looking for " + args[0]);
        System.out.println(" in " + args[1]);
        while ((line = get.readLine()) != null) {
          lines++;
          Matcher fit = hunter.matcher(line);
          if (fit.find()) {
            System.out.println(lines + ": " + line);
            matches++;
          }
        }
        if (matches == 0) {
          System.out.println("No matches in " + lines + " lines");
        }
      }
    }
    Used with the pattern "ERTA" and this file (genomic sequence):
    AAAAAAAAAAAERTAAAAAAAAAERT [end of line]
    ABBBBBBBBBBBBBBBBBBBBBBERT [end of line]
    ACCCCCCCCCCCCCCCCCCCCCCERT [end of line]
    it reports that it has found the pattern only in this line:
    "1: AAAAAAAAAAAERTAAAAAAAAAERT"
    while my pattern is present 4 times.
    Is a BufferedReader really a good idea here?
    Does anyone have an idea?
    thanx
    Edited by: jfact on Dec 21, 2007 4:39 PM
    Edited by: jfact on Dec 21, 2007 4:43 PM

    Quick and dirty demo:
    import java.io.*;
    import java.util.regex.*;
    public class LineDemo {
        public static void main (String[] args) throws IOException {
            File in = new File("test.txt");
            BufferedReader get = new BufferedReader(new FileReader(in));
            int found = 0;
            String previous = "", next, lookingFor = "ERTA";
            Pattern p = Pattern.compile(lookingFor);
            while ((next = get.readLine()) != null) {
                String toInspect = previous + next;
                Matcher m = p.matcher(toInspect);
                while (m.find()) found++;
                // Carry over only the last (pattern length - 1) characters, so a
                // match straddling the line break is found without double-counting.
                int keep = Math.min(next.length(), lookingFor.length() - 1);
                previous = next.substring(next.length() - keep);
            }
            System.out.println("Found '" + lookingFor + "' " + found + " times.");
        }
    }
    /* test.txt contains these four lines:
    AAAAAAAAAAAERTAAAAAAAAAERT
    ABBBBBBBBBBBBBBBBBBBBBBERT
    ACCCCCCCCCCCCCCCCCCCCCCERT
    ACCCCCCCCCCCCCCCCCCCCCCBBB
    */

  • Question about reading a very big file into a buffer.

    Hi, everyone!
    I want to randomly load several characters from the GB2312 charset to form a string.
    I have two questions:
    1. Where can I find the charset table file? I have googled for hours but failed to find the GB2312 charset file.
    2. I think the charset table file is very big, and I doubt whether I can load it into a String or StringBuffer. Does anyone have a solution? How do I load a very big file and randomly select several characters from it?
    Have I made myself understood?
    Thanks in advance,
    George

    The following gives the correspondence between GB2312-encoded byte arrays and characters (printed as hexadecimal integers).
    import java.nio.charset.*;
    import java.io.*;
    public class GBs {
        static String convert() throws UnsupportedEncodingException {
            StringBuffer buffer = new StringBuffer();
            String l_separator = System.getProperty("line.separator");
            Charset chset = Charset.forName("EUC_CN"); // GB2312 is an alias of this encoding
            CharsetEncoder encoder = chset.newEncoder();
            int[] indices = new int[Character.MAX_VALUE + 1]; // entries default to 0
            for (int j = 0; j <= Character.MAX_VALUE; j++) {
                if (encoder.canEncode((char) j)) indices[j] = 1;
            }
            byte[] encoded;
            for (int j = 0; j < indices.length; j++) {
                if (indices[j] == 1) {
                    encoded = Character.toString((char) j).getBytes("EUC_CN");
                    for (int q = 0; q < encoded.length; q++) {
                        buffer.append(Byte.toString(encoded[q]));
                        buffer.append(" ");
                    }
                    buffer.append(": 0x");
                    buffer.append(Integer.toHexString(j));
                    buffer.append(l_separator);
                }
            }
            return buffer.toString();
        }
        // the following is for testing
        /*
        public static void main(String[] args) throws Exception {
            String str = GBs.convert();
            System.out.println(str);
        }
        */
    }

  • Big File vs Small file Tablespace

    Hi All,
    I have a doubt and just want to confirm which is better: one bigfile datafile for a tablespace, or many small datafiles. I think it is better to use a bigfile tablespace.
    Kindly help me out as to whether I am right or wrong, and why.

    GirishSharma wrote:
    Aman.... wrote:
    Vikas Kohli wrote:
    With respect to performance, I guess a bigfile tablespace is a better option
    Why ?
    If you allow me to post, I would like to paste the text below from the doc linked in my first reply:
    "Performance of database opens, checkpoints, and DBWR processes should improve if data is stored in bigfile tablespaces instead of traditional tablespaces. However, increasing the datafile size might increase time to restore a corrupted file or create a new datafile."
    Regards
    Girish Sharma
    Girish,
    I find it interesting that I've never found any evidence to support the performance claims - although I can think of reasons why there might be some truth to them and could design a few tests to check. Even if there is some truth in the claims, how significant or relevant might they be in the context of a database that is so huge that it NEEDS bigfile tablespaces ?
    Database opening:  how often do we do this - does it matter if it takes a little longer - will it actually take noticeably longer if the database isn't subject to crash recovery ?  We can imagine that a database with 10,000 files would take longer to open than a database with 500 files if Oracle had to read the header blocks of every file as part of the database open process - but there's been a "delayed open" feature around for years, so maybe that wouldn't apply in most cases where the database is very large.
    Checkpoints: critical in the days when a full instance checkpoint took place on log file switch - but (a) that hasn't been true for years, (b) incremental checkpointing made a big difference to the I/O peak when an instance checkpoint became necessary, and (c) we have had a checkpoint process for years (if not decades) which updates every file header when necessary rather than requiring DBWR to do it
    DBWR processes: why would DBWn handle writes more quickly - the only idea I can come up with is that there could be some code path that has to associate a file id with an operating system file handle of some sort and that this code does more work if the list of files is very long: very disappointing if that's true.
    On the other hand, I recall many years ago (8i time) crashing a session when creating roughly 21,000 tablespaces for a database, because some internal structure relating to file information reached the 64MB hard limit for a memory segment in the SGA. It would be interesting to hear if anyone has recently created a database with the 65K+ limit for files - and whether it makes any difference whether that's 66 tablespaces with about 1,000 files each, or 1,000 tablespaces with about 66 files each.
    Regards
    Jonathan Lewis
