What is a big file?

Hi all. I was making a smart folder for big files when the question hit me: when does a file become a big file? 100MB? 500MB? 1GB? I would love to hear anyone's opinions.

It's all relative. It's fair to define the relative size of a file in terms of entropy and utility, both of which are measurable (entropy empirically, utility through observation).
Entropy is a measure of the randomness of the data. If the data is regular, it should be compressible. High entropy (approaching 1 bit of information per bit of storage) means a good use of space; low entropy (near 0 per bit), a waste.
Utility is simpler: what fraction of the bits in a file are ever used for anything? Even if a file has high entropy, if you never refer to part of it, that part's existence is useless.
You can express the qualitative size of a file as the product of its entropy and its utility. Zero entropy means the file contains just about no information, so regardless of how big it is, it's larger than it needs to be. If you never access even one bit of a file, it's also too big no matter how many bits it holds. If you use 100% of a file (1 bit accessed per bit stored) but its entropy is 0.5 per bit, then the file is twice as big as it needs to be to represent the information it contains.
An uncompressed bitmap is large, whereas a run-length encoded bitmap that represents the exact same data is small.
An MP3 file is based on the idea that you can throw away information in a sound sample to make it smaller and still generate something that sounds very similar to the original. Two files can represent the same sound, but the MP3 is smaller because it sacrifices some of the bits, lowering the precision with which the original is represented (taking advantage of the fact that human perception of the differences is limited).
So 100MB would seem like a ridiculous size for a forum post, but it sounds small for a data file from a super-high-resolution mass spectrometer.
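For the empirically measurable half of that definition, here's a minimal sketch in Java (my choice of language; nothing in the thread implies one) that estimates a file's entropy from its byte frequencies. It only models single bytes, so it overestimates the entropy of anything with longer-range structure, but it's enough to tell random-looking data from padded or repetitive data:

import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.IOException;

// Estimate entropy in bits per byte (max 8.0). Near 8: the bytes look
// random/incompressible. Near 0: highly redundant. Divide by 8 to get
// the "per bit" figure used above.
public class FileEntropy {
    public static void main(String[] args) throws IOException {
        long[] counts = new long[256];
        long total = 0;
        try (BufferedInputStream in =
                new BufferedInputStream(new FileInputStream(args[0]))) {
            int b;
            while ((b = in.read()) != -1) {
                counts[b]++;
                total++;
            }
        }
        double entropy = 0.0;
        for (long c : counts) {
            if (c == 0) continue;
            double p = (double) c / total;
            entropy -= p * (Math.log(p) / Math.log(2));
        }
        System.out.printf("%.3f bits/byte (%.3f per bit)%n", entropy, entropy / 8.0);
    }
}

A freshly compressed archive should score near 8 bits per byte (close to 1 per bit); a zero-padded disk image scores near 0.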

Similar Messages

  • What are good ways to send a big file (20MB-100MB) to my friend?

    What are good ways to send a big file (20MB-100MB) to my friend?
    Thanks in advance

    If this is over the internet, iChat is probably your best bet,
    but if you just want a local transfer,
    plug a FireWire cable into both of your computers, shut one of them down, then hold "T" while pressing its power button (Target Disk Mode). The restarted computer should pop up as an external drive on the second computer.

  • I'd like Time Machine to back up my personal account's files separately from my guest account's files. Both have big files. What's the best way to do this? Should I connect two external HDs to my new iMac, one for each account?

    I'd like Time Machine to back up my personal account's files separately from my guest account's files. Both have big files. What's the best way to do this? Should I connect two external HDs to my new iMac, one for each account?

    NeuroBrain wrote:
    Since my new external hard drive has a lot of space, I'm thinking of splitting it between Time Machine and external storage.
    This is a common mistake and I highly advise against it.
    1: Time Machine saves states of changes and thus requires more room on the TM drive than the boot drive it's backing up.
    2: If something happens to the TM drive (loss, theft, a drop, a power surge, etc.), you lose both backups.
    3: The storage partition might turn into a portable need; with it on the TM drive, you're increasing the risk that something happens to the TM backup along with the storage drive, due to the increased movement.
    Seriously, have a read:
    Most commonly used backup methods
    It's an ASC User Tip that saves us regulars the trouble of having to repeat ourselves over and over again in the posts, because we tend to forget things too, or aren't here sometimes.
    "Plan for the worst and the good will take care of itself" - Donald Trump

  • Adobe Photoshop CS3 crashes each time it loads a big file

    I was loading a big file of photos from iMac iPhoto into Adobe Photoshop CS3 and it kept crashing, yet each time I reopen Photoshop it loads the photos again and crashes again. Is there a way to stop this cycle?

    I don't think that too many users here actually use iPhoto (even the Mac users).
    However, Google is your friend. A quick search came up with some other non-Adobe forum entries:
    .... but the golden rule of iPhoto is NEVER EVER MESS WITH THE IPHOTO LIBRARY FROM OUTSIDE IPHOTO. In other words, anything you might want to do with the pictures in iPhoto can be done from *within the program,* and that is the only safe way to work with it. Don't go messing around inside the "package" that is the iPhoto Library unless you are REALLY keen to lose data, because that is exactly what will happen.
    .....everything you want to do to a photo in iPhoto can be handled from *within the program.* This INCLUDES using a third-party editor, and it saves a lot of time and disk space if you do it this way:
    1. In iPhoto's preferences, specify a third-party editor (let's say Photoshop) to be used for editing photos.
    2. Now, when you right-click (or control-click) a photo in iPhoto, you have two options: Edit in Full Screen (ie iPhoto's own editor) or Edit with External Editor. Choose the latter.
    3. Photoshop will open, then the photo you selected will automatically open in PS. Do your editing, and when you save (not save as), PS "hands" the modified photo back to iPhoto, which treats it exactly the same as if you'd done that stuff in iPhoto's own editor and updates the thumbnail to reflect your changes. Best of all, your unmodified original remains untouched so you can always go back to it if necessary.

  • I've doubled my RAM but still can't save big files in photoshop...

    I have just upgraded my RAM from the shipped 4GB to 8GB, but I still can't save big images in Photoshop. It seems to have made no difference to the speed of general tasks and saving, and I can't save big files in Photoshop at all. I already moved massive amounts of files off my computer and onto my external hard drive, so there is much less on my computer now, and twice the RAM, but it is making no noticeable difference. When I click Memory under "About This Mac" it shows that the RAM is installed and now has twice the memory. Could this be something to do with Photoshop? I'm running CS6.

    Also, I just converted 220 cm to inches and it's roughly 86.6 inches, over 7 feet in length.
    With an image that large, you may want to consider downsampling to a lower DPI; it will make images of this size much easier to work on and process, and easier for your print house to process as well.
    You might want to consider working with these rather large images at 225 DPI instead of 300 if resolution at close viewing distances is still a concern.
    Or, what you could try is working with the images at 300 DPI, but then saving/exporting them as a JPEG at the highest image quality setting.
    I do a lot of projects where I use a high-resolution JPEG to save on image-processing overhead.
    The final printed images still come out pretty clear, clean and crisp without having to process those large files at a print house or deal with the full-resolution image in a page-layout or illustration program.
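    To put rough numbers on that, here's a quick back-of-the-envelope in Java; the 3:1 banner aspect ratio and 8-bit RGB are my assumptions, and only the 220 cm width and the two DPI figures come from the thread:

    // Pixel count and raw (uncompressed) memory for a 220 cm wide image
    // at two candidate resolutions. Illustrative values only.
    public class PrintSize {
        public static void main(String[] args) {
            double widthInches = 220 / 2.54; // 220 cm ≈ 86.6 in
            for (int dpi : new int[] {300, 225}) {
                long widthPx = Math.round(widthInches * dpi);
                long heightPx = widthPx / 3;  // assumed 3:1 banner shape
                double rawMB = widthPx * heightPx * 3 / (1024.0 * 1024.0);
                System.out.printf("%d DPI: %d px wide, ~%.0f MB uncompressed%n",
                        dpi, widthPx, rawMB);
            }
        }
    }

    Dropping from 300 to 225 DPI cuts the raw pixel data to about 56% of its size ((225/300)^2 = 0.5625), which is why it makes such a difference both in Photoshop and at the print house.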

  • Error in loading big file

    Hi All,
    I have an application on WL8.1 with SP3; the database is SQL Server 2000. I have the code below to load a local file into the database. The column's data type in the database is image.
    PreparedStatement pStatement = null;
    InputStreammyStream myStream = new InputStream();
    myStream.setEmbeddedStream( is );
    pStatement.setBinaryStream( 1, myStream, -1 );
    pStatement.executeUpdate();
    pStatement.close();
    pStatement = null;
    Here "is" is an InputStream opened on a local file, and the SQL statement is
    insert into file_content(content) values(?)
    This works fine for files smaller than 150MB, but for big files (>150MB) it fails with the error message below:
    <Feb 11, 2005 12:00:41 PM PST> <Notice> <EJB> <BEA-010014> <Error occurred while attempting to rollback transaction: javax.transaction.SystemException: Heuristic hazard: (weblogic.jdbc.wrapper.JTSXAResourceImpl, HeuristicHazard, (javax.transaction.xa.XAException: [BEA][SQLServer JDBC Driver]Object has been closed.))
    javax.transaction.SystemException: Heuristic hazard: (weblogic.jdbc.wrapper.JTSXAResourceImpl, HeuristicHazard, (javax.transaction.xa.XAException: [BEA][SQLServer JDBC Driver]Object has been closed.))
    at weblogic.transaction.internal.ServerTransactionImpl.internalRollback(ServerTransactionImpl.java:396)
    at weblogic.transaction.internal.ServerTransactionImpl.rollback(ServerTransactionImpl.java:362)
    Can anybody help? Thanks in advance.

    I already answered this as to the cause in the MS newsgroups... It's the DBMS choking so badly on the size of your image file that it actually kills the connection. Note that the DBMS has to save the whole image to the log as well as to the DBMS table. The fundamental response is to get DBA help to configure the DBMS to be able to do what you want. If you want to and can, you could split your image column into multiple image columns, split your data into 100MB chunks, insert an empty row and then update it column by column, and then concatenate the data again in the client when you read it back, etc. But it's much better to post to the MS server newsgroups for ideas.
    Joe
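    A rough sketch of Joe's column-splitting workaround in plain JDBC, in case it's useful; the id column and the content1..contentN column names are hypothetical, whether the driver truly streams each chunk rather than buffering it is driver-specific, and the try-with-resources syntax is modern Java used here for brevity (2005-era code would use explicit finally blocks). Treat it as an outline, not a tested fix:

    // Hypothetical schema: file_content(id int, content1 image, content2 image, ...)
    // Insert an empty row, then stream one ~100MB chunk into each image column
    // in its own statement, so no single operation carries the whole 150MB+.
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.InputStream;
    import java.sql.Connection;
    import java.sql.PreparedStatement;

    public class ChunkedImageInsert {
        static final long CHUNK = 100L * 1024 * 1024; // 100MB per column

        public static void insert(Connection conn, File file) throws Exception {
            try (PreparedStatement ins = conn.prepareStatement(
                    "insert into file_content(id) values (?)")) {
                ins.setInt(1, 1);
                ins.executeUpdate();
            }
            long remaining = file.length();
            int col = 1;
            try (InputStream in = new FileInputStream(file)) {
                while (remaining > 0) {
                    int len = (int) Math.min(CHUNK, remaining);
                    try (PreparedStatement upd = conn.prepareStatement(
                            "update file_content set content" + col + " = ? where id = ?")) {
                        upd.setBinaryStream(1, in, len); // reads only this chunk
                        upd.setInt(2, 1);
                        upd.executeUpdate();
                    }
                    remaining -= len;
                    col++;
                }
            }
        }
    }

    Reading it back is the reverse: select each column in order and concatenate the streams in the client, as Joe describes.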

  • Can't move big files, even with Mac OS Extended (Journaled)!

    Hi all =)
    This time I really can't understand it.
    I'm trying to move a big file (an app 9 GB large) from my MacBook's Applications folder to an external USB drive formatted HFS+, but at about 5.5 GB the process stops and I get this error:
    "The Finder can't complete the operation because some data in "application.app" can't be read or written. (error code -36)"
    I tried searching for this error code on the internet with no results.
    I tried transferring the same file to a different drive (which is also HFS+) but I still get the same error code.
    Both drives have plenty of free space.
    I also tried different USB cables.
    The app in question was just fully downloaded, with no errors, from the App Store.
    What should I try now? Any suggestion is welcome... this situation is so frustrating!
    Thanks

    LowLuster wrote:
    The Applications folder is a System folder and there are restrictions on it. I'm not sure why you are trying to copy an app out of it to an external drive. Try copying it to a folder on the ROOT of the internal drive first (a folder you create using an Admin account), and then copy that to the external.
    Thanks a lot LowLuster, you actually solved my situation!
    But apparently in this "forum" you can't un-mark the chosen answer if you clicked the wrong one by mistake, and you can't edit your messages, and you can't even delete your messages... such freedom down here, jeez!

  • Out of memory when big files are being sent to N80...

    I downloaded a few big MP3 files (15-20MB) onto the 128MB card I got with the phone. When my new 2GB phone came I didn't have a card reader for my PC, and my wife said she wanted these songs on her S.E. 810i, so I sent them over to her using Bluetooth (with no problems whatsoever), in the hope I could send them back to my phone with my new 2GB card in. When I try to send them back from the 810i over Bluetooth, I get an out-of-memory message, but I have next to nothing in phone memory (no videos/images/sounds etc.). Can anyone tell me if I would be right in saying that when I download from my home network the files are temporarily stored on the memory card until you confirm where you want to save them, whereas Bluetooth tries to store them in phone memory first? Hope you got all that! If this is the case, is there any way I can change it? If anyone is still with me and knows what I am talking about, please enlighten me!! It's driving me mad!
    thanks

    1. All messages will be stored/moved to the memory card.
    2. I doubt it. The phone will still save files received via Bluetooth in a messaging folder. If this folder is on the phone memory, there will never be enough space to receive really big files.
    dmz

  • How to read big files

    Hi all,
    I have a big text file (about 140MB) containing data I need to read and save (after analysis) to a new file.
    The text file contains 4 columns of data (so each row has 4 values to read).
    When I try to read the whole file at once I get a "Memory full" error message.
    I tried to read only a certain number of lines each time and then write them to the new file. This is done using the loop in the attached picture (just a portion of the code); the loop repeats as many times as needed.
    The problem is that for such big files this method is very slow, and if I increase the number of lines read each time, I still see the PC's free memory descending slowly in the performance window...
    Does anybody have a better idea how to implement this kind of task?
    Thanks,
    Mentos.
    Attachments:
    Read a file portion.png (13 KB)

    Hi Mark & Yamaeda,
    I made some tests and came up with 2 different approaches; see the VIs and example data file attached.
    The "Read lines approach.vi" reads a chunk with a specified number of lines, parses it, and then saves the chunk to a new file.
    This worked more or less OK, depending on the delay. However, in reality I'll need to write the first 2 columns to the file and only after that the 3rd and 4th columns. So I think I'll need to read the file twice: the first time take the first 2 columns and save them to the file, then repeat the loop and take the other 2 columns and save them...
    Regarding the free memory: I see it drop a bit during the run, and it goes up again once I run the VI another time.
    The "Read bytes aproach.vi" reads a specified number of bytes in each chunk until it finishes reading the whole file. Only then does it save the chunks to the new file. No parsing is done here (just for the example), only reading and writing, to see if the free memory stays the same.
    I used 2 methods for saving: with the String Subset function and with the Replace Substring function.
    When using Replace Substring (the disabled part) the free memory was 100% stable, but it worked very slowly.
    When using String Subset the data was saved VERY fast, but some free memory was consumed.
    The reading part also consumed some free memory, at a rate that depended on the delay I put in.
    Which method looks better?
    What do you recommend changing?
    Attachments:
    Read lines approach.vi (17 KB)
    Read bytes aproach.vi (17 KB)
    Test file.txt (1 KB)
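    The VIs are LabVIEW, but the chunking idea is language-independent, so here is the same single-pass approach sketched in Java purely for illustration (the output file names are made up, and I'm assuming the 4 values per row are whitespace-separated). Writing columns 1-2 and columns 3-4 to two separate files in one pass would also avoid reading the 140MB file twice:

    import java.io.BufferedReader;
    import java.io.BufferedWriter;
    import java.io.FileReader;
    import java.io.FileWriter;
    import java.io.IOException;

    // Stream the file line by line instead of loading all 140MB at once:
    // memory use stays flat at roughly one line buffer regardless of file size.
    public class SplitColumns {
        public static void main(String[] args) throws IOException {
            try (BufferedReader in = new BufferedReader(new FileReader("Test file.txt"));
                 BufferedWriter first = new BufferedWriter(new FileWriter("cols12.txt"));
                 BufferedWriter second = new BufferedWriter(new FileWriter("cols34.txt"))) {
                String line;
                while ((line = in.readLine()) != null) {
                    String[] v = line.trim().split("\\s+"); // 4 whitespace-separated values
                    if (v.length < 4) continue;             // skip malformed rows
                    first.write(v[0] + "\t" + v[1]);
                    first.newLine();
                    second.write(v[2] + "\t" + v[3]);
                    second.newLine();
                }
            }
        }
    }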

  • Problem loading a message with a big file attached

    I am on holiday and I don't have my fast DSL connection here, only the internal 56k modem in my laptop. Someone sent me a message including a big file. I want to load this message, but every time I try, it stops loading before the message has arrived in my mailbox. Now it blocks all the other messages from downloading. What can I do? Can anybody help me? A big THANKS in advance!

    Since you have a laptop, go to some location that has free wireless. Download the mail from there.
     Cheers, Tom

  • Broken Ftp Connection and big files problem

    I have a problem with downloading big files.
    Does anybody know how to resume a download over an FTP connection?
    Or how can I read bytes from an FTP-connected file using something like random access, to avoid restarting the download from the beginning?
    InputStream does not support seek-like methods.

    From RFC 959
    RESTART (REST)
    The argument field represents the server marker at which
    file transfer is to be restarted. This command does not
    cause file transfer but skips over the file to the specified
    data checkpoint. This command shall be immediately followed
    by the appropriate FTP service command which shall cause
    file transfer to resume.
    You should also be aware of RFC 959 Section 3.4.2 on BLOCK MODE transfers which is what allows FTP to REST a connection and "skip" n-bytes of a file.
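    For what it's worth, on the Java side the stock java.net URL handler doesn't expose REST, but Apache Commons Net's FTPClient does (a third-party dependency, not something mentioned in the thread). A sketch of resuming from wherever the local copy left off:

    import java.io.File;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import org.apache.commons.net.ftp.FTP;
    import org.apache.commons.net.ftp.FTPClient;

    // Resume a partial download: send REST <local size> before RETR and
    // append the remaining bytes to the local file.
    public class FtpResume {
        public static void resume(String host, String user, String pass,
                                  String remotePath, File local) throws IOException {
            FTPClient ftp = new FTPClient();
            try {
                ftp.connect(host);
                ftp.login(user, pass);
                ftp.setFileType(FTP.BINARY_FILE_TYPE);
                ftp.enterLocalPassiveMode();
                ftp.setRestartOffset(local.exists() ? local.length() : 0); // REST
                InputStream in = ftp.retrieveFileStream(remotePath);       // RETR
                if (in == null) {
                    throw new IOException("RETR failed: " + ftp.getReplyString());
                }
                try (InputStream src = in;
                     OutputStream out = new FileOutputStream(local, true)) { // append
                    byte[] buf = new byte[8192];
                    int len;
                    while ((len = src.read(buf)) > 0) {
                        out.write(buf, 0, len);
                    }
                }
                ftp.completePendingCommand();
            } finally {
                if (ftp.isConnected()) {
                    ftp.disconnect();
                }
            }
        }
    }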

  • What is the max file size for Lightroom?

    I have some very big files. Lightroom says they are too big to catalog, but what is the max file size?

    There is no max file size.
    As John points out, the limits are on the number of pixels, not the file size.

  • Suggestion needed for processing Big Files in Oracle B2B

    Hi,
    We are doing a feasibility study on using Oracle AS Integration B2B instead of TIBCO. We presently use TIBCO for our B2B transactions. Since my client company is planning to implement Fusion Middleware (Oracle ESB and Oracle BPEL), we are also looking at Oracle AS Integration B2B for B2B transactions (in other words, we are planning to replace TIBCO with Oracle Integration B2B if possible).
    I am really concerned about one thing: receiving and processing any "BIG FILE" (15 MB in size) from a trading partner.
    Present scenario: one of our trading partners sends invoice documents in a single file, and that file can grow up to 15 MB. In our existing setup, when we receive such big files from the trading partner (through TIBCO Business Connect, BC), TIBCO BC works fine for 1 or 2 files but crashes once it has received multiple files of that size. What happens is that the memory TIBCO BC consumes to receive one such big file is not released after processing, and as a result TIBCO BC throws an "OUT OF MEMORY" error after processing some files.
    My questions:
         1. How robust is Oracle AS Integration B2B in terms of processing such big files?
         2. Is there any upper limit on the size of data Oracle AS Integration B2B can receive and process?
         3. What is the average time required to receive and process such a big file? (Let's say we are talking about 15 MB.)
         4. Is there any documentation available that discusses handling such big files in Oracle B2B?
    Please let me know if you need more information.
    Thanks in advance.
    Regards,
    --Kaushik

    Hi Ramesh,
    Thanks for your comment. We will try to do a POC ASAP. I will definitely keep in touch with you during this.
    Thanks a bunch.
    Regards,
    --Kaushik

  • Keeping "CS Web Service session" alive while uploading big files.

    Hi.
    I have a problem when uploading big files that take longer than the session timeout value, causing the upload to fail.
    As you all know, uploading a file is a three-step process:
    1). Create a new DocumentDefinition Item on the server as a placeholder.
    2). Open an HTTP connection to the created placeholder and transfer the data using the HTTPConnection.put() method.
    3). Create the final document using the FileManager by passing in the destination folder and the document definition.
    The problem is that step 2 takes so long that the "CS Web Service Session" times out, and thus step 3 cannot be completed. The Developer Guide gives a utility method for creating an HTTP connection for step 2, and it states the following: "..you must create a cookie for the given domain and path in order to keep the session alive while transferring data." But this only keeps the session of the HTTP connection alive, not the "CS Web Service Session". In my case step 2 completes successfully, and the moment I perform step 3 it throws an ORACLE.FDK.SessionError:ORACLE.FDK.SessionNotConnected exception.
    How does one keep the "CS Web Service Session" alive?
    Thanks in advance
    Regards.

    Okay, even a thread that pushes dummy stuff through once in a while doesn't help. I'm getting the following when the keep-alive thread kicks in while uploading a big file.
    "AxisFault
    faultCode: {http://xml.apache.org/axis/}HTTP
    faultSubcode:
    faultString: (409)Conflict
    faultActor:
    faultNode:
    faultDetail:
    {}:return code: 409
    <HTML><HEAD><TITLE>409 Conflict</TITLE></HEAD><BODY><H1>409 Conflict</H1>Concurrent Requests On The Same Session Not Supported</BODY></HTML>
    {http://xml.apache.org/axis/}HttpErrorCode:409
    (409)Conflict
         at org.apache.axis.transport.http.HTTPSender.readFromSocket(HTTPSender.java:732)
         at org.apache.axis.transport.http.HTTPSender.invoke(HTTPSender.java:143)
         at org.apache.axis.strategies.InvocationStrategy.visit(InvocationStrategy.java:32)
         at org.apache.axis.SimpleChain.doVisiting(SimpleChain.java:118)
         at org.apache.axis.SimpleChain.invoke(SimpleChain.java:83)
         at org.apache.axis.client.AxisClient.invoke(AxisClient.java:165)
         at org.apache.axis.client.Call.invokeEngine(Call.java:2765)
         at org.apache.axis.client.Call.invoke(Call.java:2748)
         at org.apache.axis.client.Call.invoke(Call.java:2424)
         at org.apache.axis.client.Call.invoke(Call.java:2347)
         at org.apache.axis.client.Call.invoke(Call.java:1804)
         at oracle.ifs.fdk.FileManagerSoapBindingStub.existsRelative(FileManagerSoapBindingStub.java:1138)"
    I don't understand this: the exception talks about "Concurrent Requests On The Same Session", but if there is already a request going on, why is the session timing out in the first place?!
    I must be doing something really stupid somewhere. Aia ajay jay, what an unproductive day...
    Any help? It will be greatly appreciated...

  • Download big files

    I am trying to download a big file from the internet. Currently I am doing the following:
                URLConnection conn = downloadURL.openConnection();
                InputStream in = conn.getInputStream();
                OutputStream out = new FileOutputStream(dst);
                // Transfer bytes from in to out
                byte[] buf = new byte[SIZE];
                int len;
                while ((len = in.read(buf)) > 0 && !cancelled) {
                    out.write(buf, 0, len);
                }
    That code works for me most of the time, but sometimes the file is not downloaded correctly, and I do not know how to test whether the download completed, nor how to guarantee that the file is downloaded completely.
    Is there some way to do this?
    Greetings,
    Magus

    "That statement makes no sense. I'm programming in Java, and Java has a fine mechanism for throwing exceptions."
    No, that statement makes no sense. Your Java code is talking TCP/IP to a server, which doesn't have such a mechanism. It can close or reset the connection; that's it. If it had reset the connection, your read() would block forever, or time out. If it had closed the connection, your read() would have returned -1, as it did, so that is what happened.
    "1. Can you expound upon what makes you think that is the reason that -1 was returned, as opposed to the connection timing out, the connection dropping, etc.?"
    Because no exception was thrown. Or else one was thrown and you have swallowed it, but the code you posted doesn't indicate that.
    "2. Can you indicate why Java would return a -1 instead of throwing some I/O exception?"
    Because the server, or possibly an intermediate firewall, closed the connection, rather than Java incurring some exception at the client end such as a timeout. The documentation you quoted at me before bears that out. That's what the -1 means. You quoted that at me yourself.
    "The whole point here is that, according to the API documentation, -1 apparently indicates that the end of the stream was reached normally, and that error conditions are indicated by exception."
    Exactly so. So the server closed the connection normally, or an intermediate firewall did, but before it had sent all the data. Why is another question. Have a look at the server logs, or investigate the firewall configuration.
    "I guess I'm asking if anyone has seen this behavior, and has any insight on why it doesn't seem to follow the API."
    It does.
    "You seem to think that this behavior is in line with the API documentation; I disagree."
    Well, you've quoted enough of it yourself: have another look, or another think. Premature (but normal) closing of the connection by the server or a firewall is the only possible explanation for what you're seeing. If it wasn't closed you wouldn't be getting the -1; if there was a timeout you would get a SocketTimeoutException; if there was any other exception you would catch that.
    I've seen plenty of short downloads in my life, but the server doesn't have a way of indicating that via TCP/IP. You have to count.
    NB: some firewalls do close long-lived connections on their own account. Is there a client-side firewall that might be doing that?
    Or else the transmitted content-length is wrong; for example, it overflows an integer.
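    Building on the loop the OP posted, "you have to count" can look like this: total the bytes received and compare against the Content-Length header once the stream ends. This is a sketch; if the server omits the header (or uses chunked encoding), getContentLengthLong() returns -1 and you can't verify this way:

    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.URL;
    import java.net.URLConnection;

    // A -1 from read() only means the connection closed; counting bytes is
    // how you tell a complete download from a premature close.
    public class VerifiedDownload {
        public static void download(URL url, String dst) throws IOException {
            URLConnection conn = url.openConnection();
            long expected = conn.getContentLengthLong(); // -1 if unknown
            long received = 0;
            try (InputStream in = conn.getInputStream();
                 OutputStream out = new FileOutputStream(dst)) {
                byte[] buf = new byte[8192];
                int len;
                while ((len = in.read(buf)) > 0) {
                    out.write(buf, 0, len);
                    received += len;
                }
            }
            if (expected >= 0 && received != expected) {
                throw new IOException("Short download: got " + received
                        + " of " + expected + " bytes");
            }
        }
    }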
