Copied Files To HDD - 'Get Info' File Sizes are different?

Hi All ---
Not sure how to put it any other way... maybe this is a normal thing?
I've copied 161.78 GB of data from the PowerBook to a LaCie HDD, but 'Get Info' on the target disk reads only 161.61 GB. All the files seem to be in place, copied, working etc. - but that's roughly 170 MB unaccounted for somewhere.
Uh, what gives?
Should I trash it and redo the whole lot again? Hmm.
Thanks for any help in advance.
John W.

Hi Joe ---
Thanks for that - I've been offline since yesterday. It seems the main PowerBook drive decided to go AWOL on me - but just overnight? No clue what's going on there. I've repaired the disk and permissions, run OnyX, etc. It just didn't want to boot, would occasionally boot from the external HDD, the drive icon vanished - all sorts of nonsense... and then it returned this a.m. as if nothing had ever happened.
I reckon the HD is on its way out... hmm.
I'm now getting lots of beachballs. Very boring. Everything is backed up externally TWICE now (and the drive cloned TWICE) - onto two separate drives. I'm just going to erase and reformat the PowerBook before reinstalling 10.3 and 10.3.9 - off to see a doctor/repairman tomorrow.
Thanks for the link - will definitely have a go with that in the future.
Cheers,
J.
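
A small difference like this usually comes down to how each volume rounds files up to whole allocation blocks rather than to missing data (see the allocation-block discussion further down this page). If you want to be certain nothing was dropped, comparing checksums of the source and the copy settles it. A rough Terminal sketch - the two paths are placeholders for your own source folder and the copy on the LaCie:

# Checksum every file under the source and under the copy, then compare the lists
# (paths below are placeholders).
cd "/Users/john/Data"    && find . -type f -exec md5 {} \; | sort > /tmp/source.md5
cd "/Volumes/LaCie/Data" && find . -type f -exec md5 {} \; | sort > /tmp/copy.md5
diff /tmp/source.md5 /tmp/copy.md5 && echo "Every file copied intact"

If diff prints nothing, every file made it across byte for byte and the Get Info discrepancy is just bookkeeping.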

Similar Messages

  • Duplicates in my iPhoto library have different file sizes - how can I delete the smaller copies?

    I have a few hundred duplicates in my iPhoto library, but the file sizes are different - one is 1.3 MB and the other is 567 KB. I want to delete the smaller ones, but short of comparing each duplicate, is there a way to do this? I've been looking at Duplicate Annihilator but I don't think it can do it.
    Thanks!

    I just ran a test with iPhoto Library Manager, Duplicate Annihilator, iPhoto Duplicate Cleaner, Duplifinder and Photodedupo. I imported a folder of 5 photos into a test library 3 times, allowing iPhoto to import duplicates. I then ran the 5 photos through a resizer to reduce their JPEG compression while keeping all other aspects of the files the same.
    None of the duplicate-removal apps found the set that was reduced in the file resizer. That's probably because the file creation date was being used as a criterion, and a resized photo has a different file creation date even though the Image Capture date is the same.
    They all found the 3 regular duplicates, and some of them would mark two and leave the 3rd unmarked. iPhoto Duplicate Cleaner can sort the found duplicates by file size, but if a file was edited to get the reduced file size it might not be found, as it would have a different file creation/modification date.
    iPhoto Library Manager was able to find all duplicates and mark them as such if the file names were the same and the filename option was selected. Otherwise it also missed the modified, resized version. It let you select which photo to keep before going to work on the library.
    So if a photo has been reduced in image quality or pixel size it will not be considered a duplicate.
    OT
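
    For spotting which copy of a pair is the smaller one without opening every photo, a read-only listing of file name and size can help. A rough sketch, assuming the library is in its default location; note that any actual deleting should be done from inside iPhoto rather than in the Finder, or the library database will get out of step:

    # Print "filename <tab> size-in-KB" for every JPEG in the library,
    # sorted by name so same-named duplicates end up on adjacent lines.
    find "$HOME/Pictures/iPhoto Library" -type f -iname '*.jpg' | while read -r f; do
        printf '%s\t%s\n' "$(basename "$f")" "$(du -k "$f" | cut -f1)"
    done | sort | less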

  • File SIZES are Different ?

    Hi Experts,
    I have implemented an ESR bypass scenario.
    The problem is that the file sizes are different in the source folder and the target folder.
    How can we resolve this issue?
    Please guide me.
    Thanks and Regards,
    Ravi Teja.

    Refer to question no. 2 in the wiki below and see if that configuration helps.
    Sender File Adapter Frequently Asked Questions - Process Integration - SCN Wiki
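
    Independent of the adapter configuration, it is worth checking at OS level whether the two files differ in content or only in representation (for example extra line endings from a text-mode transfer). A quick sketch, with the two paths as placeholders:

    # Byte counts and detected file type of both copies (paths are placeholders)
    wc -c /source/dir/payload.dat /target/dir/payload.dat
    file  /source/dir/payload.dat /target/dir/payload.dat
    # First few byte-level differences, if any
    cmp -l /source/dir/payload.dat /target/dir/payload.dat | head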

  • After a duplicate operation, the file sizes (checkpoint file sizes) are different

    Hi,
    I have some questions.
    We are testing 4-way replication. After a duplicate operation, the checkpoint file sizes reported by the OS command (du -sh) are different.
    Is this normal?
    TimesTen Version : TimesTen Release 7.0.5.0.0 (64 bit Solaris)
    OS Version : SunOS 5.10 Generic_141414-02 sun4u sparc SUNW,SPARC-Enterprise
    [TEST17A] side
    [TEST17A] /mmdb/DataStore # du -sh ./*
    6.3G ./SAMPLE
    410M ./SAMPLE_LOG
    [TEST17A] /mmdb/DataStore/SAMPLE # ls -lrt
    total 13259490
    -rw-rw-rw- 1 timesten other 501 Aug 14 2008 SAMPLE.inval
    -rw-rw-rw- 1 timesten other 4091428864 Jan 29 02:13 SAMPLE.ds1
    -rw-rw-rw- 1 timesten other 4113014784 Jan 29 02:23 SAMPLE.ds0
    [TEST17A] /mmdb/DataStore/SAMPLE # ttisql sample
    Command> dssize ;
    PERM_ALLOCATED_SIZE: 8388608
    PERM_IN_USE_SIZE: 36991
    PERM_IN_USE_HIGH_WATER: 36991
    TEMP_ALLOCATED_SIZE: 524288
    TEMP_IN_USE_SIZE: 5864
    TEMP_IN_USE_HIGH_WATER: 6757
    [TEST17B] side
    [TEST17B] /mmdb/DataStore # du -sh ./*
    911M ./SAMPLE
    453M ./SAMPLE_LOG
    [TEST17B] /mmdb/DataStore/SAMPLE # ls -lrt
    total 1865410
    -rw-rw-rw- 1 timesten other 334 Dec 11 2008 SAMPLE.inval
    -rw-rw-rw- 1 timesten other 4091422064 Jan 29 02:25 SAMPLE.ds1
    -rw-rw-rw- 1 timesten other 4091422064 Jan 29 02:25 SAMPLE.ds0
    [TEST17B] /mmdb/DataStore/SAMPLE # ttisql sample
    Command> dssize;
    PERM_ALLOCATED_SIZE: 8388608
    PERM_IN_USE_SIZE: 432128
    PERM_IN_USE_HIGH_WATER: 432128
    TEMP_ALLOCATED_SIZE: 524288
    TEMP_IN_USE_SIZE: 5422
    TEMP_IN_USE_HIGH_WATER: 6630
    [TEST18A] side
    [TEST18A] /mmdb/DataStore # du -sh ./*
    107M ./SAMPLE
    410M ./SAMPLE_LOG
    [TEST18A] /mmdb/DataStore/SAMPLE # ls -lrt
    total 218976
    -rw-rw-rw- 1 timesten other 4091422064 Jan 29 02:22 SAMPLE.ds0
    -rw-rw-rw- 1 timesten other 4091422064 Jan 29 02:32 SAMPLE.ds1
    [TEST18A] /mmdb/DataStore/SAMPLE # ttisql sample
    Command> dssize;
    PERM_ALLOCATED_SIZE: 8388608
    PERM_IN_USE_SIZE: 36825
    PERM_IN_USE_HIGH_WATER: 37230
    TEMP_ALLOCATED_SIZE: 524288
    TEMP_IN_USE_SIZE: 6117
    TEMP_IN_USE_HIGH_WATER: 7452
    [TEST18B] side
    [TEST18B] /mmdb/DataStore # du -sh ./*
    107M ./SAMPLE
    411M ./SAMPLE_LOG
    [TEST18B] /mmdb/DataStore/SAMPLE # ls -lrt
    total 218976
    -rw-rw-rw- 1 timesten other 4091422064 Jan 29 02:18 SAMPLE.ds1
    -rw-rw-rw- 1 timesten other 4091422064 Jan 29 02:28 SAMPLE.ds0
    [TEST18B] /mmdb/DataStore/SAMPLE # ttisql sample
    Command> dssize;
    PERM_ALLOCATED_SIZE: 8388608
    PERM_IN_USE_SIZE: 36785
    PERM_IN_USE_HIGH_WATER: 37140
    TEMP_ALLOCATED_SIZE: 524288
    TEMP_IN_USE_SIZE: 5927
    TEMP_IN_USE_HIGH_WATER: 7199
    Thank you very much.
    GooGyum

    You don't really give much detail on what operations were performed and in what sequence (e.g. duplicate from where to where...), nor whether there was any workload running when you did the duplicate. In general, checkpoint file sizes among replicas will not be the same / remain the same because:
    1. Replicas are logical replicas, not physical replicas. Replication transfers and applies logical operations, and even if you try to do exactly the same thing at both sides in exactly the same order, there are internal operations etc. that are not necessarily synchronised, which will cause the sizes of the files to vary somewhat.
    2. The size of the file as reported by 'ls -l' represents the maximum offset that has so far been written to in the file, but the current 'usage' of the file may be less than this at present.
    3. Checkpoint files are 'sparse' files (unless created with PreAllocate=1), so the space used as reported by 'du' will in general not correspond to the size of the file as reported by 'ls -l'.
    Unless you are seeing some kind of problem, I would not be concerned about an apparent difference in size.
    Chris
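
    Point 3 is easy to demonstrate outside TimesTen. On a filesystem that supports sparse files (UFS, ZFS, ext3 and friends), the logical size shown by 'ls -l' and the blocks actually allocated as shown by 'du' can differ wildly. A minimal sketch that writes a single byte one gigabyte into an otherwise empty file (the file name is arbitrary):

    # Write one byte at offset 1 GiB; everything before it is a "hole"
    dd if=/dev/zero of=/tmp/sparse.demo bs=1 count=1 seek=1073741824
    ls -l /tmp/sparse.demo    # logical size: about 1 GiB
    du -sh /tmp/sparse.demo   # allocated space: a handful of KB
    rm /tmp/sparse.demo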

  • FCP file sizes are twice the size of Quicktime's - any advantage to FCP?

    When I capture footage from my Canon HV40 MiniDV using FCP, the files are almost twice the size of those I get when I capture using QuickTime's Device Native setting. Is there any advantage to capturing with FCP, as the quality seems the same in QT? Or is there a way to reduce the file size being captured in FCP?
    I'd like to be able to capture in FCP because if there is a dropped frame in the recording, QuickTime stops recording, whereas FCP keeps going and starts a new file. But the larger file size in FCP puts me off.

    Not all codecs are editable... meaning, not all of them work in an editing application. I'm not sure what "Device Native" captures as in QT - does it say in Get Info? FCP captures HDV and DV natively, in a format it can work with: HDV comes in as HDV, DV as DV.
    I wouldn't trust what QT is capturing as "Device Native". And 13 GB/hour of footage is not that much - that's the lowest data rate format out there.
    Shane

  • Why are the files my program creates duplicated, and why are the file sizes sometimes too small?

    My program uses a Queue<Uri> to hold a queue of URLs, and then uses a WebBrowser to navigate to each Uri from the Queue in turn, get the URL's (HTML) source content and save it to the hard disk.
    The problem is that sometimes the text files on the hard disk are small, like 90 KB or 60 KB, and sometimes they are as they are supposed to be, 300 KB or 200 KB.
    This is the button click event where I'm calling the two methods:
    private void toolStripButton3_Click(object sender, EventArgs e)
    {
        GetHtmls();
        CheckQueue();
    }
    This is the GetHtmls method code:
    private Queue<Uri> myUrls = new Queue<Uri>();
    private bool isBusy = false;

    private void GetHtmls()
    {
        for (int i = 1; i < 49; i++)
        {
            adrBarTextBox.Text = sourceUrl + i;
            targetHtmls = (combinedHtmlsDir + "\\Html" + i + ".txt");
            Uri targetUri = new Uri(sourceUrl + i);
            myUrls.Enqueue(targetUri);
        }
    }
    sourceUrl contains the website address: http://www.tapuz.co.il/forums2008/forumpage.aspx?forumid=393&pagenumber=
    I append the page numbers to it to build the page URLs and add them to the Queue.
    Then the CheckQueue method:
    Uri uri;

    private void CheckQueue()
    {
        if (isBusy)
            return; // We're downloading some page right now, don't disturb
        isBusy = true; // OK, let's get started
        if (myUrls.Count == 0) // No more pages to download, we're done
        {
            isBusy = false;
            return;
        }
        uri = myUrls.Dequeue(); // Get one URL from queue
        getCurrentBrowser().Navigate(uri);
    }
    It's supposed to navigate to each Uri (HTML address) in the Queue.
    And the browser's DocumentCompleted event:
    private void Form1_DocumentCompleted(object sender, WebBrowserDocumentCompletedEventArgs e)
    {
        // If the page loaded completely then do something
        int urlnumber = uri.ToString().IndexOf("pagenumber=");
        string number = uri.ToString().Substring(urlnumber + 11);
        int num = Int32.Parse(number);
        targetHtmls = (combinedHtmlsDir + "\\Html" + num + ".txt");
        StreamWriter writer = File.CreateText(targetHtmls);
        writer.Write(getCurrentBrowser().DocumentText);
        writer.Close();
        isBusy = false; // We're done
        CheckQueue(); // Check next page in queue
    }
    In the completed event I'm getting the page number, building the path for the text file, and then writing the HTML source content to that text file.
    In the end I have 48 text files on my hard disk.
    The problems are:
    1. Sometimes it seems like it's not finished navigating to the current Uri - or maybe there's some other reason, perhaps a server-side problem - and it creates small files that contain some of the source content but not all of it. Sometimes the text files are 99 KB or 70 KB each, and sometimes they are around 300 KB or 200 KB, which are the correct sizes.
    2. The text files on my hard disk are supposed to be 48 different files, each containing the source of one of the 48 HTML pages. But on my hard disk the 48 text files are duplicated for some reason.
    These are the files on my hard disk: some of them are 205 KB, 350 KB or 175 KB, and some are 85 KB, 94 KB or 35 KB.
    Why do some of the files not contain the full source - did the navigation not finish, or did it not get all the source?
    And why is every second file the same as the one before it? It's supposed to create 48 different files, but I'm getting two identical files per navigation.

    I solved it now.
    This is what I did:
    It's a bit of a slow process, since I'm waiting for each page to be loaded into the WebBrowser, but it does the work.
    using System;
    using System.Collections.Generic;
    using System.ComponentModel;
    using System.Data;
    using System.Drawing;
    using System.Linq;
    using System.Text;
    using System.Threading.Tasks;
    using System.Windows.Forms;
    using System.Collections;
    using System.IO;
    using System.Net;

    namespace WindowsFormsApplication1
    {
        public partial class Form1 : Form
        {
            private string sourceUrl = "http://test.test";
            private string htmlsTargetDirectory = "Test Htmls";
            private string appDir = Path.GetDirectoryName(@"C:\Users\chocolade1972\AppData\Local\Test_Images\Test Images\Test Htmls");
            private string combinedHtmlsDir;
            private String targetHtmls;
            private int counter = 1;
            private StreamWriter w;
            private string uri;
            private bool htmlloaded = false;
            private List<string> myurls = new List<string>(); // assumed field: not shown in the original post

            public Form1()
            {
                InitializeComponent();
                webBrowser1.ScriptErrorsSuppressed = true;
                combinedHtmlsDir = Path.Combine(appDir, htmlsTargetDirectory);
                if (!Directory.Exists(combinedHtmlsDir))
                {
                    Directory.CreateDirectory(combinedHtmlsDir);
                }
            }

            private void Form1_Load(object sender, EventArgs e)
            {
                GetHtmls();
                timer1.Enabled = true;
            }

            private void GetHtmls()
            {
                uri = sourceUrl + counter;
                targetHtmls = (combinedHtmlsDir + "\\Html" + counter + ".txt");
                webBrowser1.Navigate(uri);
            }

            private void webBrowser1_DocumentCompleted(object sender, WebBrowserDocumentCompletedEventArgs e)
            {
                // Only react when the completed URL is the one we navigated to,
                // not a frame or other sub-document
                if (e.Url.ToString() == uri)
                {
                    targetHtmls = (combinedHtmlsDir + "\\Html" + counter + ".txt");
                    htmlloaded = true;
                    StreamWriter writer = File.CreateText(targetHtmls);
                    writer.Write(webBrowser1.DocumentText);
                    writer.Close();
                    FileInfo fi = new FileInfo(targetHtmls);
                    var size = fi.Length;
                    w = new StreamWriter(combinedHtmlsDir + "\\test.txt", true);
                    w.WriteLine("File Size " + size);
                    w.Close();
                }
            }

            private void timer1_Tick(object sender, EventArgs e)
            {
                if (htmlloaded == true)
                {
                    uri = sourceUrl + counter;
                    myurls.Add(uri);
                    webBrowser1.Navigate(uri);
                    htmlloaded = false;
                    counter += 1;
                }
            }
        }
    }

  • Spooling of a query generates different file sizes for different databases

    Please help me with a problem with spooling. I spooled a query's output to a file from two different databases. In both databases the table structure is the same and the output produced only one row, but the file size is different between the databases. How can this happen? Is there any database parameter that needs to be checked? Both databases are on the same version, 10.2.0.1.0.
    Before running the spool I did
    SQL> set head off feedback off echo off verify off linesize 10000 pages 0 trims on colsep ' '
    in both sessions.
    In one database the file size is 1463 bytes and in the other it is 4205 bytes.
    Please help me find out the cause of this discrepancy.

    Hi Mario,
    I think you are not getting my point. Both files contain the same output, but their sizes are different. This is due to the number of blank spaces between columns. I wanted to clarify why there is a difference between the two file sizes when the query output is the same.
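
    One way to confirm that the extra bytes really are inter-column and trailing blanks rather than different data is to make the whitespace visible and compare line lengths. A sketch, with the two file names as placeholders for the spool files:

    # Line lengths of each spool file -- padding shows up as longer lines
    awk '{ print length }' spool_db1.lst
    awk '{ print length }' spool_db2.lst
    # sed -n l marks line ends with '$' and shows non-printing characters
    sed -n l spool_db1.lst > /tmp/db1.vis
    sed -n l spool_db2.lst > /tmp/db2.vis
    diff /tmp/db1.vis /tmp/db2.vis | head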

  • FCE file sizes are twice the size of Quicktime's - any advantage to FCE?

    When I capture footage from my Canon HV40 MiniDV using FCE, the files are almost twice the size of those I get when I capture using QuickTime's Device Native setting. Is there any advantage to capturing with FCE, as the quality seems the same? Or is there a way to reduce the file size being captured in FCE?
    I'd like to be able to capture in FCE because if there is a dropped frame in the recording, QuickTime stops recording, whereas FCE keeps going and starts a new file. But the larger file size in FCE puts me off.

    There is no difference in the two files. They are both DV Stream (.dv).
    If you look in the Movie Properties window you'll notice that both tracks (audio and video) have the same file size.
    Obviously, this can't mean that the audio track is really as large as the video track in data size. It means that the only way QuickTime can "separate" the stream into two editable tracks is to duplicate them (but ignore the video playback of the audio track portion).
    I don't use Final Cut but I would suspect it might make two files (audio and video) instead of a single file made by QuickTime Player Pro.
    Confusing, isn't it?

  • Is it possible to have same file size for different dpi?

    I converted one TIFF file (300 dpi, 1024x1332) to JPG files at four different DPI settings. But when I checked the four resulting JPG files, I found that they are all the same file size and quality. (I also checked the files' properties in Windows.)
    I think more DPI means more data and a bigger file size. Am I wrong?
    I use Photoshop CS 5.1(64bit, WINDOWS) - which is part of my Adobe Master Collection CS5.5.
    TIFF(300dpi, 1024X1332) ->
    1. JPG(72dpi, 1024X1332) : 306KB
    2. JPG(108dpi, 1024X1332) : 306KB
    3. JPG(144dpi, 1024X1332) : 306KB
    4. JPG(600dpi, 1024X1332) : 306KB
    I tested a few more times with different files, and the same result came out (same file size for different DPI).
    Thanks in advance.

    Yes absolutely. Great observation.  PPI does not control the number of pixels, just how big they are.
    Now, if you change the PPI in the Image Size dialog with Resample checked... then that is a different story. In that case you will be changing the pixel dimensions of the image (i.e., changing the total number of pixels making up the image) while keeping its print size.
    In your test files, you will notice all the print sizes are different, because all you were telling Photoshop to do was change the size of the pixels (if or when the image is ever printed), which is really just a bit of metadata in the file.
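
    To put numbers on it: print size is just the pixel dimensions divided by the PPI, while the pixel data (and hence the JPEG size) is identical in all four files:

    1024 x 1332 px at  72 ppi -> prints about 14.2" x 18.5"
    1024 x 1332 px at 300 ppi -> prints about  3.4" x  4.4"
    1024 x 1332 px at 600 ppi -> prints about  1.7" x  2.2"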

  • Trying to download updates to CoPilot Live and CoPilot GPS with maps - file sizes are large and taking hours on a wireless connection

    Trying to download updates to CoPilot Live and CoPilot GPS with maps. The file sizes are large and are taking hours to download over a wireless connection. How can I download app updates and new maps while connected to a PC and iTunes through a hard-wired internet link?

    I'm on my iPad, so I don't know if this is the page with an actual download. I don't see a button, but assume that is because I  am on an iPad. It is in the DL section of Apple downloads.
    http://support.apple.com/kb/DL1708

  • Copying files folders display different sizes

    I'm pretty sure I'm not crazy, but isn't it true that when you copy files from one drive to another, the sizes as reported in the Finder may not always be equal even though the actual files are exactly the same? Nothing fancy here, just a manual drag and drop of about 20 folders with many files in each. In all cases, one drive is displaying consistently smaller folder sizes than the other, by anywhere from less than 1% to as much as 30%. Should I worry, or is this just one of those oddities of sectors and formatting and so on?

    It can vary due to file allocation block sizes, but 30% would mean either really small files (a 1,025-byte file requires two 1,024-byte blocks and may be reported as 2,048 on one drive), or that one drive is not HFS+ - perhaps FAT32 - in which case whole forks can get lost.
    Or are you talking about the Get Info window, which reports the size in bytes?
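
    The block-rounding effect is easy to see for yourself in Terminal: a file's space on disk is rounded up to whole allocation blocks, and the block size differs from volume to volume. A tiny sketch (the file name is arbitrary):

    # A 1-byte file still occupies one whole allocation block
    printf x > /tmp/one-byte.txt
    ls -l /tmp/one-byte.txt   # logical size: 1 byte
    du -k /tmp/one-byte.txt   # space used: one full block, e.g. 4 KB on a typical HFS+ volume
    rm /tmp/one-byte.txt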

  • Get Info folder size is incorrect/too large

    I am getting an incorrect report from the Get Info command that one of the folders in my 8 TB array is topping 15 TB! In fact, this folder only contains data (RED movie files) amounting to about 800 GB in total. This problem, in turn, is preventing me from copying that directory to a backup drive, as it is seen as being too large, which it is not.
    Of course this is impossible, as my array, as mentioned above, is only 8TB. And when I do a Get Info on the array drive icon itself, it reads correctly, listing the Capacity as 8TB, and the Available space at 1.8TB. So, it's just the folders that are being read incorrectly.
    I have not enabled or set up Time Machine, so I do not think that is the issue.
    I am new to Macs and really have no clue how to remedy this situation.
    ANY help would be appreciated.

    Thank you for the response, Eric. I've already tried that. Didn't make a difference.
    Since I can't afford to wait for this to be solved, and because I don't know how to solve this issue myself, I've gone ahead and deleted the offending folder, created a new folder with a different name, and re-populated it with output files from Davinci Resolve. In effect, I rendered my project out again from scratch. Unfortunately the same **** thing is happening! I'm creating a populated folder that Get Info reports as being 16 Terabytes, on an 8 Terabyte array!
    (*sigh*)
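
    If it happens again, it may be worth comparing what the Finder is adding up against what is actually allocated on disk before deleting anything; if 'du' agrees with the roughly 800 GB you expect, the data itself is fine and only the reported total is off. A sketch with placeholder paths:

    # Blocks actually allocated on disk for the folder
    du -sh "/Volumes/Array/Red Footage"
    # Sum of the files' logical sizes (closer to what Get Info adds up)
    find "/Volumes/Array/Red Footage" -type f -exec ls -l {} + \
        | awk '{ total += $5 } END { printf "%.1f GB\n", total / 1e9 }'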

  • MP4/H264 file sizes very different

    I seem to have the problem that when I convert files from AVI to MP4 or H.264, they remain rather large.
    I converted a movie of 124 minutes to H.264 and the file is 494 MB, while I also have a 154 minute movie which is only 352 MB. The iTunes info says the 124 minute movie is a low complexity H.264 320x240 film at bit rate 64 kbps (total bit rate 539 kbps). It is 494 MB, or 3.9 MB/min.
    The 154 minute movie's info gives low complexity MPEG-4 320x240 film at bit rate 63 kbps (total bit rate 313 kbps). It is 352 MB, or 2.3 MB/min.
    I am using Total Video Converter to convert. I leave all the settings at default; I just select H.264 and change the dimensions to 320x240. Does anyone have any idea how I can get the H.264 or MPEG-4 files to be smaller?

    H.264 is higher quality than MP4, therefore larger file size.
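
    As a rough cross-check using the numbers quoted above: file size is essentially the total bit rate multiplied by the running time, so the sizes reported are about what those bit rates predict (approximate, since kbps/MB conventions and container overhead vary a little):

    539 kbit/s x 124 min x 60 s/min / 8 bits-per-byte ≈ 500 MB   (reported: 494 MB)
    313 kbit/s x 154 min x 60 s/min / 8 bits-per-byte ≈ 360 MB   (reported: 352 MB)

    So the lever for smaller files is the total bit rate (and the dimensions) you encode at; at the same bit rate, H.264 generally gives better quality than plain MPEG-4.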

  • Copy and Paste on "Get Info"

    So I just switched to a Mac, and have a question. Why are you not able to copy and paste song info in iTunes? For example, I am updating the info for Blue Oyster Cult. However, I want it to read Blue Öyster Cult - you know, with those 'things' over the O. But when I try to paste this into the artist info in iTunes, it will not let me. Is there something I can do to solve this problem?

    Vitaminmax wrote:
    So I just switched to a Mac, and have a question. Why are you not able to copy and paste song info in iTunes? For example, I am updating the info for Blue Oyster Cult. However, I want it to read Blue Öyster Cult - you know, with those 'things' over the O. But when I try to paste this into the artist info in iTunes, it will not let me. Is there something I can do to solve this problem?
    Where are you trying to paste it? If it's into the song listing in your library, you need to click on the Artist, then hover over it until you can enter the data. The easier method is to Get Info on the song (or a group of songs), then paste the info into the Artist field in the Get Info window. If iTunes won't let you do that, then something's amiss with the song file.
    Incidentally, those "things" are known as an umlaut or diaeresis. You can type Ö on a Mac by typing option-u, then shift-O.

  • LMS 3.0.1 log files are not purged

    This is LMS 3.0.1.
    The log files just keep growing every day.
    I have a 12 GB partition for log files, but it fills up every 2 weeks.
    So every two weeks I have to stop LMS, delete all the log files and start LMS again manually.
    Also, there are the following messages:
    CiscoWorks Folder Utilization is 8%. Processes DFMCTMStartup, DataPurge are down.
    I tried to start these processes several times, but it failed every time.
    How do I make LMS keep the log files from growing over the Recommended Size Limit?

    It is normal for those two processes to show as done - do not try to start them manually.
    To rotate log files periodically, you will need to configure logrot. Search the Common Services online help for "logrot" and you will get the instructions for configuring it. Essentially, you will need to run NMSROOT/bin/perl NMSROOT/bin/logrot.pl -c, then walk through the menu prompts to configure the files you want to rotate.
