Two Big Files on my HD

I was cleaning up my Mac when I found two very large files:
- Windowserver_Last.log
- Sleepimage
Can I just delete them by dragging them to the trash?

Shozen
There are no dumb questions, only dumb answers!
How do I get to look at it using the Console application?
You'll find the Console Application in /Applications/Utilities. Double click on it, and you'll get one or two windows, showing the console.log and (probably) the system.log. At the top of the window, in the toolbar, there is a "Logs" icon. Click on this to reveal a sidebar with all available logs. Click on the disclosure triangle alongside /var/log and scroll down to the appropriate log.
Due to a bug, you'll probably see lots of these messages repeating:
<pre>Jul 23 00:53:50 [267] "loginwindow" (0x3703) set hot key operating mode to all disabled
Jul 23 00:53:50 [267] Hot key operating mode is now all disabled
Jul 23 12:18:42 [267] "loginwindow" (0x3703) set hot key operating mode to normal
Jul 23 12:18:42 [267] Hot key operating mode is now normal</pre>You can disregard these: although they are tedious, they don't occur often enough to make those files that big. Look for other repeating messages. You may see a few kCGErrorIllegalArgument : … messages, but unless they are repeating quickly they are also harmless.

Similar Messages

  • Adobe Photoshop CS3 crashes each time it loads a big file

    I was loading a big file of photos from iPhoto on my iMac into Adobe Photoshop CS3 and it kept crashing, yet each time I reopen Photoshop it loads the photos again and crashes again. Is there a way to stop this cycle?

    I don't think that too many users here actually use iPhoto (even the Mac users)
    However, Google is your friend. A quick search came up with some other non-Adobe forum entries:
    .... but the golden rule of iPhoto is NEVER EVER MESS WITH THE IPHOTO LIBRARY FROM OUTSIDE IPHOTO. In other words, anything you might want to do with the pictures in iPhoto can be done from *within the program,* and that is the only safe way to work with it. Don't go messing around inside the "package" that is the iPhoto Library unless you are REALLY keen to lose data, because that is exactly what will happen.
    .....everything you want to do to a photo in iPhoto can be handled from *within the program.* This INCLUDES using a third-party editor, and saves a lot of time and disk space if you do this way:
    1. In iPhoto's preferences, specify a third-party editor (let's say Photoshop) to be used for editing photos.
    2. Now, when you right-click (or control-click) a photo in iPhoto, you have two options: Edit in Full Screen (ie iPhoto's own editor) or Edit with External Editor. Choose the latter.
    3. Photoshop will open, then the photo you selected will automatically open in PS. Do your editing, and when you save (not save as), PS "hands" the modified photo back to iPhoto, which treats it exactly the same as if you'd done that stuff in iPhoto's own editor and updates the thumbnail to reflect your changes. Best of all, your unmodified original remains untouched so you can always go back to it if necessary.

  • Trying to edit the .js file again, but...I now have TWO .js files

    RE: Putting a song into my iweb page, following Mike Wong's instructions with considerable and valuable help along the way from James Tseng.
    I have spent all day working at this and failing countless times.
    I finally got to the point where I was supposed to edit/change the .js file with the changes in height from 32 to 16, the controllers from true to false, the autoplays from false to true, and the loop from false to true.
    Although I was not totally clear on what was going on, I slogged through it and had to re-do it and still have to re-do it because thus far the song is not on the page. Meanwhile, due to a townhall meeting that I went to, the site will be getting more visitors, and I want a song on that page.
    MY QUESTION AT THIS POINT:
    I have TWO .js files for that particular page in my iDisk. Shouldn't I have just ONE .js file? I started out with only one .js file and I worked on it and have no idea how TWO .js files got there with the .html files and all the others.
    I was under the impression that if I changed something on that particular page and then republished, I would end up with a brand new fresh, up to date .js file. But No. I still have TWO .js files and at this point when I have to go back and re-do the information changes that I listed above, I do not know which .js file to pick.
    Beyond that, I am disturbed (1) that I have TWO of those .js files, and (2) why, when I changed the page and republished, I did not end up with just ONE .js file, new and up to date and unburdened by the other two pages that may be wrong or who knows what.
    If this is an omen that I should just quit trying to put another song on my page, then maybe for now I will just relax and give it a rest. I do have ONE song on my page, but that was with a lot of hand-holding help, and although I did take notes, I apparently did not take sufficiently good notes to see me through a solitary run.
    I would appreciate any input on the issue of why I have two .js files and why republishing did not resolve said problem.
    I don't feel that I should try to edit html on the wrong page.
    Lorna in Southern California

    Hi Lorna,
    Somehow you duplicated your js file. No big deal;
    but I don't know which one to tell you to delete.
    Would it be a big deal to just republish your site
    and start over?
    .......... Lorna says ................................................
    I just now did that. You can read my report in the post.
    When you edit your js file with an editor, all you
    will do is "Save" the changes. You're not going to
    "Republish" the changes. Just click "Save" and
    that's it.
    .......... Lorna says ................................................
    And that is exactly what I did! I clicked Save. Didn't get me my song on the page, though.
    Assuming you have successfully converted/compressed
    the audio file and that part of it is done, for now,
    let's just worry about changing SIX WORDS in the js
    file. That's all you need to do... republish the
    thing and we'll get it figured out.
    .......... Lorna says ................................................
    I did. I selected the .js file. I opened it with TextEdit (learning so many new things). I went through all the text between the two words <object and object>. And I clicked Save.
    No song.
    Lorna in Southern California

  • Question about reading a very big file into a buffer.

    Hi, everyone!
    I want to randomly load several characters from
    GB2312 charset to form a string.
    I have two questions:
    1. Where can I find the charset table file? I have used
    google for hours to search but failed to find GB2312 charset
    file out.
    2. I think the charset table file is very big and I doubt
    whether I can load it into a String or StringBuffer. Does anyone
    have a solution? How do I load a very big file and randomly
    select several characters from it?
    Have I made myself understood?
    Thanks in advance,
    George

    The following can give the correspondence between GB2312 encoded byte arrays and characters (in hexadecimal integer expression).
    import java.nio.charset.*;
    import java.io.*;

    public class GBs {
        static String convert() throws UnsupportedEncodingException {
            StringBuffer buffer = new StringBuffer();
            String l_separator = System.getProperty("line.separator");
            Charset chset = Charset.forName("EUC_CN"); // GB2312 is an alias of this encoding
            CharsetEncoder encoder = chset.newEncoder();
            int[] indices = new int[Character.MAX_VALUE + 1];
            for (int j = 0; j <= Character.MAX_VALUE; j++) {
                indices[j] = encoder.canEncode((char) j) ? 1 : 0;
            }
            byte[] encoded;
            for (int j = 0; j < indices.length; j++) {
                if (indices[j] == 1) {
                    encoded = Character.toString((char) j).getBytes("EUC_CN");
                    for (int q = 0; q < encoded.length; q++) {
                        buffer.append(Byte.toString(encoded[q]));
                        buffer.append(" ");
                    }
                    buffer.append(": 0x");
                    buffer.append(Integer.toHexString(j));
                    buffer.append(l_separator);
                }
            }
            return buffer.toString();
        }

        // the following is for testing
        public static void main(String[] args) throws Exception {
            String str = GBs.convert();
            System.out.println(str);
        }
    }
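    The table dump above answers the lookup half. For the original question (randomly picking GB2312 characters to form a string), here is a rough sketch along the same lines: instead of hunting for a charset table file, ask the JDK's EUC_CN encoder which characters it can encode and sample from those. The class name, the helper names, and the restriction to the CJK Unified Ideographs range are my own assumptions, not part of any standard API.

```java
import java.nio.charset.Charset;
import java.nio.charset.CharsetEncoder;
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

public class RandomGB2312 {
    // Collect every character in the CJK Unified Ideographs range
    // (U+4E00..U+9FA5) that the EUC_CN (GB2312) encoder can encode.
    static List<Character> encodableChars() {
        CharsetEncoder encoder = Charset.forName("EUC_CN").newEncoder();
        List<Character> chars = new ArrayList<>();
        for (char c = 0x4E00; c <= 0x9FA5; c++) {
            if (encoder.canEncode(c)) chars.add(c);
        }
        return chars;
    }

    // Build a string of `length` characters drawn at random from the pool.
    static String randomString(int length, Random rng) {
        List<Character> pool = encodableChars();
        StringBuilder sb = new StringBuilder(length);
        for (int i = 0; i < length; i++) {
            sb.append(pool.get(rng.nextInt(pool.size())));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(randomString(5, new Random()));
    }
}
```

    Note that this never loads a table file at all, so the "very big file" concern goes away: the pool of a few thousand characters fits comfortably in memory.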

  • Adobe Acrobat stops working when I open two PDF files

    I am using Adobe Acrobat 11.0.03 on my desktop running Windows 8. The big issue for me is that Acrobat just stops working when I try to open two PDF files at the same time. But it is fine when I open just one. Could anyone give me some advice? Thank you very much.

    Hi, Stacy:
    I am using Adobe Acrobat XI. Currently, it works perfectly when I only open one PDF file. If I try to open two files at the same time, Acrobat just stops working and shows "Adobe Acrobat has stopped working".
    It is so frustrating because I always need to open and edit multiple PDF files at the same time.
    I really hope I can get some help.
    Thank you, Stacy.

  • What Is A big file?

    Hi All. I'm making a smart folder for big files when the question hit me. When does a file become a big file? 100MB? 500MB? 1GB? I would love to hear anyone's opinions.

    It's all relative. It's fair to define the relative size of a file in terms of entropy and utility, both of which are measurable (entropy empirically, utility through observation).
    The entropy is a measure of the randomness of the data. If it's regular, it should be compressible. High entropy (1 per bit) means a good use of space, low entropy (0 per bit), a waste.
    Utility is simple: what fraction of the bits in a file are ever used for anything. Even if it's got high entropy, if you never refer to a part of the file, its existence is useless.
    You can express the qualitative size of a file as the product of the entropy and the utility. Zero entropy means the file itself contains just about no information -- so, regardless of how big it is, it's larger than it needs to be. If you never access even one bit of a file, it's also too big regardless of how many bits are in the file. If you use 100% (1 bit per bit) of a file, but its entropy is 0.5 per bit, then the file is twice as big as it needs to be to represent the information contained in it.
    An uncompressed bitmap is large, whereas a run-length encoded bitmap that represents the exact same data is small.
    An MP3 file is based on the idea that you can throw away information in a sound sample to make it smaller, but still generate something that sounds very similar to the original. Two files can represent the same sound, but the MP3 is smaller because you sacrifice some of the bits by lowering the precision with which you represent the original (taking advantage of the fact that human perception of the differences is limited).
    So, 100M would seem like a ridiculous size for a forum post, but it sounds small for a data file from a super-high resolution mass spectrometer.
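    For a smart folder you still need a plain byte threshold, but the entropy half of this answer is straightforward to measure. A minimal sketch (class and method names are mine, not from any standard tool) that estimates a file's Shannon entropy in bits per byte from its byte frequencies; 8.0 means the bytes look random (well compressed or encrypted), values near 0 mean highly redundant data:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class FileEntropy {
    // Shannon entropy of the byte distribution, in bits per byte (0.0..8.0).
    static double entropyBitsPerByte(byte[] data) {
        long[] counts = new long[256];
        for (byte b : data) counts[b & 0xFF]++;
        double entropy = 0.0;
        for (long c : counts) {
            if (c == 0) continue;
            double p = (double) c / data.length;
            entropy -= p * (Math.log(p) / Math.log(2)); // -p * log2(p)
        }
        return entropy;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = Files.readAllBytes(Path.of(args[0]));
        System.out.printf("%.3f bits/byte%n", entropyBitsPerByte(data));
    }
}
```

    Note this is only a first-order estimate (it ignores correlations between bytes), and the utility half of the definition can't be computed from the file alone -- you'd have to observe actual access patterns.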

  • Working with big files

    I would appreciate any suggestion as to how best one can work with big files. I have eight one-hour tapes containing information about various places I visited. Some contain only one place while others can contain two or more places. As a preference I like to create up to one-hour DVDs that contain an identifiable segment; e.g. either one place only or more, but I do try not to break up a place and record it on two DVDs. However, the problem I faced in the past is that by the time I get to the end of an hour, the program Premiere Elements 7 (PE7) is getting slow and keeps telling me that it is running out of memory. That is despite the fact that I have 3 GB of RAM and over 200 GB of dedicated separate hard drive. I tried saving segments under ‘Share’, ‘DV AVI’ and selecting ‘DV PAL Standard’ (the system I use), but the result, as far as quality was concerned, was very disappointing. There is no option to change or set quality as was the case in previous versions of PE. Nor can I combine two .prel files into one while I am editing.
    Is there any other way I can work in segments, say 15 minutes, and combine them at the end without losing on the quality? 
    Any suggestion would be most appreciated.
    Tom 

    Hi Steve
    It is a Sony DCR-HC40E PAL, and they were captured as AVI clips.
    The project settings are:
    General – not highlighted but it shows DV Pal and 25.00 frame/second
    Video – 720 horizontal 576 vertical (4/3)
    Display format - 25 fps Timecode
    Capture – DV Capture
    Tom

  • Problem reading big file. No, bigger than that. Bigger.

    I am trying to read a file roughly 340 GB in size. Yes, that's "Three hundred forty". Yes, gigabytes. (I've been doing searches on "big file java reading" and I keep finding things like "I have this huge file, it's 600 megabytes!". )
    "Why don't you split it, you moron?" you ask. Well, I'm trying to.
    Specifically, I need a slice "x" rows in. It's nicely delimited, so, in theory:
    (pseudocode)
    BufferedReader fr = new BufferedReader(new FileReader(new File(myhugefile)));
    int startLine = 70000000;
    String line;
    int linesRead = 0;
    while (((line = fr.readLine()) != null) && (linesRead < startLine))
        linesRead++; // we don't care about these lines
    // ok, we're where we want to be, start caring
    int linesWeWant = 100;
    linesRead = 0;
    while (((line = fr.readLine()) != null) && (linesRead < linesWeWant)) {
        doSomethingWith(line);
        linesRead++;
    }
    (Please assume the real code is better written and has been proven to work with hundreds of "small" files (under a gigabyte or two). I'm happy with my file read/file slice logic, overall.)
    Here's the problem. No matter how I try reading the file, whether I start with a specific line or not, whether I am saving out a line to a string or not, it always dies with an OOM at around row 793,000,000. The OOM is thrown from BufferedReader.readLine. Please note I'm not trying to read the whole file into a buffer, just one line at a time. Further, the file dies at the same point no matter how high or low (within reason) I set my heap size, and watching the memory allocation shows it's not coming close to filling memory. I suspect the problem is occurring when I've read more bytes than fit in an int.
    Now -- the problem is that it's not just this one file -- the program needs to handle a general class of comma- or tab- delimited files which may have any number of characters per row and any number of rows, and it needs to do so in a moderately sane timeframe. So this isn't a one-off where we can hand-tweak an algorithm because we know the file structure. I am trying it now using RandomAccessFile.readLine(), since that's not buffered (I think...), but, my god, is it slow... my old code read 79 million lines and crashed in under about three minutes, the RandomAccessFile() code has taken about 45 minutes and has only read 2 million lines.
    Likewise, we might start at line 1 and want a million lines, or start at line 50 million and want 2 lines. Nothing can be assumed about where we start caring about data or how much we care about, the only assumption is that it's a delimited (tab or comma, might be any other delimiter, actually) file with one record per line.
    And if I'm missing something brain-dead obvious...well, fine, I'm a moron. I'm a moron who needs to get files of this size read and sliced on a regular basis, so I'm happy to be told I'm a moron if I'm also told the answer. Thank you.

    LizardSF wrote:
    FWIW, here's the exact error message. I tried this one with RandomAccessFile instead of BufferedReader because, hey, maybe the problem was the buffering. So it took about 14 hours and crashed at the same point anyway.
    Exception in thread "AWT-EventQueue-0" java.lang.OutOfMemoryError: Java heap space
         at java.util.Arrays.copyOf(Unknown Source)
         at java.lang.AbstractStringBuilder.expandCapacity(Unknown Source)
         at java.lang.AbstractStringBuilder.append(Unknown Source)
         at java.lang.StringBuffer.append(Unknown Source)
         at java.io.RandomAccessFile.readLine(Unknown Source)
         at utility.FileSlicer.slice(FileSlicer.java:65)
    Still haven't tried the other suggestions, wanted to let this run.

    Rule 1: When you're testing, especially when you don't know what the problem is, change ONE thing at a time.
    Now you've introduced RandomAccessFile into the equation you still have no idea what's causing the problem, and neither do we (unless there's someone here who's been through this before).
    Unless you can see any better posts (and there may well be; some of these guys are Gods to me too), try what I suggested with your original class (or at least a modified copy). If it fails, chances are that there IS some absolute limit that you can't cross; in which case, try Kayaman's suggestion of a FileChannel.
    But at least give yourself the chance of KNOWING what or where the problem is happening.
    Winston
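    If the OOM really does come from readLine buffering one enormous "line" (the stack trace shows StringBuffer.append growing without bound, which is what happens when no line terminator is found), one way to test that theory is to skip the unwanted rows without ever materializing them as strings: count newline bytes on a raw stream. A sketch, assuming the file genuinely is newline-delimited; the class and method names here are illustrative, not from the poster's code:

```java
import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;

public class LineSkipper {
    // Advances the stream past `lines` line terminators without building
    // a String per line; returns the number of lines actually skipped.
    static long skipLines(InputStream in, long lines) throws IOException {
        long skipped = 0;
        int b;
        while (skipped < lines && (b = in.read()) != -1) {
            if (b == '\n') skipped++;
        }
        return skipped;
    }

    public static void main(String[] args) throws IOException {
        try (InputStream in = new BufferedInputStream(new FileInputStream(args[0]))) {
            long skipped = skipLines(in, 70_000_000L);
            // From here, wrap `in` in a Reader and read the 100 lines wanted.
            System.out.println("Skipped " + skipped + " lines");
        }
    }
}
```

    Because it only ever holds one int at a time, this either sails past row 793,000,000 (confirming the giant-line theory, since the first wanted line would then blow up instead) or reveals that the slice logic itself is at fault.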

  • WRT160NL: copying big files to storage will not succeed

    When I put a big file, like a 7 GB ISO image, on my hard disk, which is connected to my router, the connection breaks down in the middle of the copying process. I tried two different hard disks but had the same problem, so I figured it has to do with the hard disk. Smaller files are no problem. Is there a time-out setting or something?
    Who can help me?

    Actually, 7 GB is a large amount of data. However, try changing the wireless channel on the router and check.
    Which security type are you using on the router? It is recommended to use WPA2 security to get the proper speed from the WRT160NL router.

  • Not enough space on my new SSD drive to import my data from time machine backup, how can I import my latest backup minus some big files?

    I just got a new 256GB SSD drive for my mac, I want to import my data from time machine backup, but its larger than 256GB since it used to be on my old optical drive. How can I import my latest backup keeping out some big files on the external drive?

    Hello Salemr,
    When you restore from a Time Machine backup, you can tell it not to transfer folders like Desktop, Documents, Downloads, Movies, Music, Pictures and Public. Take a look at the article below for the steps to restore from your backup.
    Move your data to a new Mac
    http://support.apple.com/en-us/ht5872
    Regards,
    -Norm G. 

  • Photoshop CC slow in performance on big files

    Hello there!
    I've been using PS CS4 since release and upgraded to CS6 Master Collection last year.
    Since my OS broke down some weeks ago (RAM failure), I gave Photoshop CC a try. At the same time I moved into new rooms and couldn't get my hands on the DVD of my CS6, resting somewhere at home...
    So I tried CC.
    Right now I'm using it with some big files. File size is between 2 GB and 7.5 GB max (all PSB).
    Photoshop seemed to run fast in the very beginning, but for a few days now it's been so unbelievably slow that I can't work properly.
    I wonder if it is caused by the growing files or some other issue with my machine.
    The files contain a large amount of layers and Masks, nearly 280 layers in the biggest file. (mostly with masks)
    The images are 50 × 70 cm at 300 dpi.
    When I try to make some brush strokes on a layer mask in the biggest file, it takes 5-20 seconds for the brush to draw... I couldn't figure out why.
    And it doesn't depend on the brush size as much as you might expect... even very small brushes (2-10 px) show this issue from time to time.
    Also, switching masks (gradient maps, selective color or levels) on and off takes ages to be displayed, sometimes more than 3 or 4 seconds.
    The same goes for panning around in the picture, zooming in and out, or moving layers.
    It's nearly impossible to work on these files in time.
    I've never seen this on CS6.
    Now I wonder if there's something wrong with PS or the OS. But: I've never been working with files this big before.
    In march I worked on some 5GB files with 150-200 layers in CS6, but it worked like a charm.
    SystemSpecs:
    i7 3930K (3.8 GHz)
    Asus P9X79 Deluxe
    64GB DDR3 1600Mhz Kingston HyperX
    GTX 570
    2x Corsair Force GT3 SSD
    Wacom Intuos 5 M Touch (I have some issues with the touch from time to time)
    WIN 7 Ultimate 64
    all system updates
    newest drivers
    PS CC
    System and PS are running on the first SSD, scratch is on the second. Both are set to be used by PS.
    RAM is allocated 79% to PS, cache levels are set to 5 or 6, history states are set to 70. I also tried different cache tile sizes from 128K to 1024K, but it didn't help a lot.
    When I open the largest file, PS takes 20-23 GB of RAM.
    Any suggestions?
    best,
    moslye

    Is it just slow drawing, or is actual computation (image size, rotate, GBlur, etc.) also slow?
    If the slowdown is drawing, then the most likely culprit would be the video card driver. Update your driver from the GPU maker's website.
    If the computation slows down, then something is interfering with Photoshop. We've seen some third party plugins, and some antivirus software cause slowdowns over time.

  • I have two excel files I cannot open or delete from my desktop.  One is titled 6ACAD200 and the other file is titled 8D73A700.  How can I get these off of my computer?

    I have two excel files I cannot open or delete from my desktop.  One is titled 6ACAD200 and the other file is titled 8D73A700.  How can I get these off of my computer?

    From the names I'd guess they are temporary files. If excel is running, quit and try again. If they still won't move or delete reboot. If they still won't move or delete...kill a chicken at midnight under an oak tree?

  • How can I combine two video files into one finished file?

    I'm making a video of a guitar player.  I'm using two cameras simultaneously.  One gives the overall wide shot, and the other is zoomed in on his hands.  Both cameras run at the same time.  I would like to use the wide shot as the main video, and occasionally fade in clips from the zoomed in shot to show details of his playing.  I can manually line up both files so they are synchronized.  How do I cut a clip out of the main (wide) shot, leave the gap there, and insert a clip from the closeup?  I'm putting the two video files on two of the video timelines and synchronizing them that way.
    Any suggestions on how to proceed?

    The effect you are looking for is 'Picture in Picture'. If you want the gap to be there for the duration of the timeline add some 'black video' - from the Tasks Pane> New Item - on the video track above the main track. Resize / position it to where you wish it to be.
    Then, with your second clip (now on video track 3):
    Do the same sizing / positioning for it.
    Split video and audio and delete or disable the audio.
    Cut it where you want the various start/ends of the fade to be.
    Delete, or just disable (right-click), the unnecessary segments.
    Apply fade in / out to the remaining segments.
    [EDIT]
    Also check out the 'Presets' in Video Effects. There are a variety of PiP presets that may do what you need.
    [/EDIT]
    Cheers,
    Neale
    Insanity is hereditary, you get it from your children

  • How do I combine two PDF files into one?

    I want to merge two PDF files into one to make things easier when I take the file(s) to a professional printer.
    Can I do this in Preview?
    Thanks.

    You can't do that in Preview.
    You can combine individual PDFs into one PDF using one of these freeware utilities.
    PDFMergeX @ http://www.malcom-mac.com/blog/pdfmergex/
    joinPDF @ http://www.macupdate.com/info.php/id/16604
    Combine PDFs @ http://www.monkeybreadsoftware.de/Freeware/CombinePDFs.shtml
    PDFLab (join & split) @ http://pdflab.en.softonic.com/mac
     Cheers, Tom

  • Can I combine two pdf files by using command lines?

    Hi
    I always need to combine two PDF files into one in my regular work.
    Currently, I open one of them, press Ctrl+Shift+I,
    find the other file and double-click it.
    It works, but it's not so efficient since I need to do
    this procedure many times every day.
    So I'm looking for a better solution,
    something like a command line,
    for example:
    acrobat.exe "combine" "doc1.pdf" "doc2.pdf"
    Does Acrobat have this function or not?
    Thx &
    Best Regards

    File > Combine
