Average file size

Hi Gurus,
What is the maximum file size for each format (Word, PDF, Excel, CAD) that can be uploaded to SAP DMS? Is there a restriction in MBs or GBs on the files to upload, or a maximum we can upload?
If so, what is the size limit for each format (Word, PDF, Excel, etc.)?
Regards
Kannan

Hi Kannan,
IMHO, you should be working on this the other way round: sort/segregate your files, understand their properties, decide/compute the average size, etc., and only then decide how large or small a file must be permitted.
Typically, there are two approaches to control/limit file size.
One is via the field 'File Size' when defining document types (Cross-App Components > DMS > Control Data > Define Document Types > File size - field FILELEN). This setting specifies the maximum size (in bytes) of an original application file for archiving to the SAP database. The value is only checked when the original files are not stored via the Knowledge Provider.
Else, if yours is a KPro/content server scenario, then I would suggest you work backwards to arrive at a content server size which will permit file uploading, as indicated in the thread below:
DMS server sizing calculation
P.S. Additionally, check your IIS settings as well; there is a provision to introduce an upload/download size limit.
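The sort/segregate/average step suggested above is easy to script before you touch any SAP settings. A minimal sketch (the folder path is a placeholder, nothing here is SAP-specific):

```python
# walk a folder of candidate originals and report count / average / max
# file size per extension, to help pick a sensible upload limit
import os
from collections import defaultdict

def size_stats(root):
    stats = defaultdict(list)
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            ext = os.path.splitext(name)[1].lower() or "(none)"
            stats[ext].append(os.path.getsize(os.path.join(dirpath, name)))
    return {ext: {"count": len(sizes),
                  "avg": sum(sizes) // len(sizes),
                  "max": max(sizes)}
            for ext, sizes in stats.items()}
```

Run it over a representative sample of your documents and base the FILELEN / content server sizing decision on the per-format averages and maxima it reports.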
Regards,
Pradeepkumar Haragoldavar

Similar Messages

  • Average file size for an app

Apparently the smaller the file size the better, but I would like to know what's the average or "ideal" file size for an app that contains videos/audio?

That will depend on the platform you are aiming for. File > New will give you some ideas in the presets.

  • What is the Average File Size for a single Episode of a TV show for Itunes?

    Say a TV show that runs around 22 minutes or so in length. What would be the File size of the file you download from Itunes?

    I don't know about 22 minutes.
The free Passion episodes are around 37 minutes and range from 171 to 179 MB.

  • PDF file size

I have an HP Officejet 6500 E710n-z (Network) that I use at home to scan into PDFs, and I know exactly how to move the slider from "Smallest size" to "Best quality".
However, the sizes of the PDFs are unacceptably large when image quality is acceptable, and if I move the slider to reduce the size then the image becomes unacceptable. It is impossible to scan a legal document of more than a few legible pages without producing a file size too large to email. I am forced to break these scanned documents into 3-4 page bite-sized chunks.
A simple one-page HOA Disclosure form, at 220 DPI and the slider in the middle for balance, produces a 699K PDF. I have used other printer-scanners at work (different brands) at the same 200 DPI that result in very clear documents at less than 100K per page.
I believe there is something wrong with HP's scan-to-PDF algorithm. The problem must be due to some unskilled (or flawed) software design. What will it take to have HP or a third-party developer (and developer staff supervision) take this seriously -- compare HP vs. other brands' scanner PDF files -- and update the HP drivers to fix this?

Just a follow up.  I went into chat mode with some low-level tech named Nathan, giving him the link to this thread.  After reading it, he suggested I could improve file size by using greyscale and 200 DPI.  DUH!!!!  I complained he was just humoring me, so he said I should call the tech support number.
    I did that next and spoke with a very nice young lady who actually DID take me seriously.  I could hear her typing away vigorously in the background, capturing every detail of my plea for this to be forwarded up the chain of command for serious consideration.  It was clear she understood and captured from me that there are many forum complaints that can be found with a search term "PDF File Size" that are getting weak or unacceptable "solutions" to push the slider left or use lower DPI etc.  She also captured my assurances that some competitor brands produce PDF file sizes 20 times smaller for the same image quality.
    I believe that this might actually be opened up at a higher level for consideration of an improvement in the compression algorithm.  My fingers are crossed.  Since I have about 11 months left for warranty support, I plan to contact them once or twice again before it expires, using the same case number to see if there is any progress.
    Bottom line:  At my default of 200 DPI and 20% image quality, with an average file size for a single sample page producing a file size of 281KB, a 25-page document creates a PDF file that is 7MB!  That will just barely make it past the file size limit for my email provider, but might be too large for the recipient.  That is still unacceptable, and is forcing me to consider products other than HP for this business purpose.
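For what it's worth, the size arithmetic in the post above checks out; a quick sketch using the figures quoted there (the email cap varies by provider and is an assumption here):

```python
# rough scan-size budgeting: per-page size times page count, in MB
def pdf_size_mb(pages, kb_per_page):
    return pages * kb_per_page / 1024

total = pdf_size_mb(25, 281)     # 25-page document at ~281 KB/page
assert round(total, 1) == 6.9    # roughly 7 MB, right at common attachment caps
```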

  • File size comparisons, InDesign CS3, CS4 and CS5?

    Hi, all.
    It seems there was a trend for several major releases where each time Adobe released a new version of InDesign and InCopy, average file sizes grew by 20% or so from the old release to the new release, at least back in the older CS days. Has this trend continued, such that file sizes in CS4 are substantially larger than file sizes from CS3, and CS5 files are substantially larger than those from CS4?
    Adobe, of course, wants to keep writing functionality that will keep the user community coming back to buy in to upgrades. The added functionality sometimes comes at a cost beyond the price tag. If file sizes are larger in a newer version, then page saves over a network or to a database are likely to be slower, and user productivity takes a hit while users are waiting for files to be saved.
    Has anyone done any testing to build the "same" page in multiple versions of InDesign to understand what the effect is on the file sizes? I'm specifically interested in the file sizes between InDesign CS3, CS4 and CS5. To be meaningful, the test page would have to be at least moderately complex, with a couple of photos, multiple text elements and so forth. By "same," the page wouldn't take advantage of new functionality in newer versions but would be saved as a native page of the current version, though the file sizes may be bloated by the new functionality like it or not.
    If you've done any testing along these lines, I'd like to hear more about it.
    Thanks.
       Mark

The overall structure of ID's files has been exactly the same for -- as far as I can see back -- CS. No change at all in there.
    There have been significant additions to the 'global' spaces; stuff like InCopy user data, table styles, object styles, and cross references. Each of these add a major chunk of data to each file, whether you use it or not, plus a few bytes per object (again, whether you use them or not -- ID also needs to know where you did not use them, that's why). I think these might be the main source of 'global' file size increase (a single object style in one of my files, for example, eats up a healthy chunk of 11,482 bytes).
    For the rest, all new stuff like 'span columns' is a handful of bytes per paragraph style. Tracking changes may very well double the size of text runs -- but 1 character takes up 1 byte of storage (plus perhaps some overhead of indicating its 'tracking' status). Any single measurement unit always uses 8 bytes at least (for example, the left inset for a column span -- even if it's not used, or set to 0pt).
    I think we're talking about a couple of K's here (oh -- perhaps a max of a hundred or so), in a file format that has been designed around the concept of "disk pages", each 4K big, meaning that sometimes adding one single character to a text box increases the file size by 4 K.
    Your idea of comparing the size of a file created in CS3 against the same saved as CS4 and as CS5 is certainly feasible -- I might try it some time, just to confirm it's purely the extra 'new objects' data that accounts for the size increase and to confirm my guesstimates of the number of Ks involved.
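The "disk pages" behavior described in the reply above is easy to model; an illustrative sketch (the 4 KB page size is the one quoted above, the helper name is made up):

```python
# file formats built on fixed-size "disk pages" grow in whole-page steps
PAGE = 4096  # 4 KB, as described above

def on_disk_size(content_bytes):
    # round up to the next whole page; one extra byte can add a full 4 KB
    pages = -(-content_bytes // PAGE)  # ceiling division
    return pages * PAGE

assert on_disk_size(4096) == 4096
assert on_disk_size(4097) == 8192  # one byte over -> one more 4 KB page
```

This is why adding a single character to a text box can bump the file size by 4 K, and why small per-object additions between versions tend to disappear into page-level rounding.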

  • File sizes - Click through selection

    How bout looking at the ballooning file sizes?
I just got pissed off at how slow Illustrator was (using the new CS4 version; I "mistakenly" left a couple of Layers palettes open) and did a copy-paste into FreeHand and finished the job.
    When done the freehand EPS file was 52k.
    Illustrator's file size for the same project? 1.1 MB
    Good Lord! import that into an InDesign file . . .
    Click through -
    If they could use a "click through" model to access items beneath layers like in inDesign and Freehand they wouldn't need to have all these layers and tiny layer previews for every tiny little piece of art on the page.
    Maybe that would save some space. Sure hard drives are getting cheaper all the time, but why are these files becoming so huge?

    Kirby,
are you recommending downgrading the original image files, even if the JPEGs in question are the only copies? JPEG compression is lossy and creates horrible artifacts if it is applied too heavily. But even with only a slight JPEG compression applied, it is difficult to apply any adjustments, like sharpening, after the compression.
    My iPhone photos are 8MP in size and the average file size is 3MB. I am quite happy with that amount of storage and would not compress the originals further.
As a test (comparison screenshots were attached in the original post):
To the left, the original JPEG at 3.85 MB, and to the right a compressed version at 598 KB.
Looking at the pixels (the sun hat at the bottom of the picture), the stronger compression hides the details, as it does in the pine cones.
    I don't apply any JPEG compression to original files, beyond what the camera does anyway.

  • How to reduce file size when using batch processing?

    I use File > Process Multiple Files to batch process photos to a smaller file size along with adding my watermark.  I've played with many different settings and no matter what I choose, I can't get my average file size to be less than about 200k.  However, when I've exported the same photos using iPhoto, I can get the file size to about half of that with no difference (to my naked eye at least) in quality.
    I definitely want to keep the height at 768 pixels so that needs to stay constant.
    My current settings in batch processing (average file size = 200 kb)
    Resize images with a height constraint of 768 pixels at 150dpi
    What I've tried:
    Resize at 72dpi (reduced file size by about 5kb)
    convert file to JPEG low quality (reduced file size by about 10kb)
    convert file to JPEG medium quality (not much difference in file size)
    I'm using PSE 10 on a Mac running Lion.
    Thank you in advance for your help!

    You should go with default settings of Optimizer.
    One difference between default settings of Optimizer and Reduce file size is that Optimizer does not guarantee a reduction in file size (if your Optimizer settings lead to an increase in file size, that's what you will get).
    With Acrobat 9, the default setting in Optimizer has an additional setting which would not do an image optimization that results in increase of file size. In that sense it would in most cases give a smaller file.

  • IPad keynote file size

What is the average file size for a one-page Pages doc with no pics, a Keynote with 10 slides and a pic on every slide, and a small Numbers spreadsheet? I want to use an iPad for school and want to know if I'll need 16 or 32 GB. Thanks for any answers.

    Folks
    I've found a workaround.
    Not the perfect solution.
But it turns out that if you use the WHITE or BLACK Keynote presentation template, the memory issue fades away.
There is a bug in the CHALKBOARD and CERULEAN templates I was using.
    steps to reproduce
    use the same photo
    create several one page presentations
    email them to yourself
    watch how the file size changes according to the presentation you choose
    for now I'll use WHITE
WHITE = 256 KB
BLACK = 298 KB
GRADIENT = 412 KB
PHOTO PORTFOLIO = 412 KB
RENAISSANCE = 549 KB
SHOWROOM = 490 KB
MODERN PORTFOLIO = 302 KB
HARMONY = 482 KB
PARCHMENT = 530 KB
CRAFT = 441 KB
CERULEAN = 1.7 MB
CHALKBOARD = 6.2 MB
There has to be a bug in the CERULEAN and CHALKBOARD templates.
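If the exported presentations are saved into one folder, the size comparison above can be scripted instead of emailing files to yourself; a minimal sketch (the folder path is a placeholder):

```python
# list files largest-first so outliers like CHALKBOARD jump out immediately
import os

def sizes_largest_first(folder):
    entries = []
    for name in os.listdir(folder):
        path = os.path.join(folder, name)
        if os.path.isfile(path):
            entries.append((os.path.getsize(path), name))
    return sorted(entries, reverse=True)
```

Point it at the folder of exported one-page presentations and the buggy templates will sit at the top of the list.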

  • I recently downloaded a series of instructional videos.  They are in mp4 format and rather large (~500MB on average).  When I try and open them with Quicktime (default) they will not play.  Are there file size restrictions?  Any suggestions would be great

    I recently downloaded a series of instructional videos.  They are in mp4 format and rather large (~500MB on average).  When I try and open them with Quicktime (default) they will not play.  Are there file size restrictions?  I have gone through all of the most recent software updates.  Any suggestions would be great.  Thanks.

    Try VLC Media Player.  It has a reputation for playing just about anything you throw at it.

  • A simple and free way of reducing PDF file size using Preview

    Note: this is a copy and update of a 5 year old discussion in the Mac OS X 10.5 Leopard discussions which you can find here: https://discussions.apple.com/message/6109398#6109398
This is a simple and free solution I found to reduce the file size of PDFs in OS X, without the high cost and awful UI of Acrobat Pro, and with acceptable quality. I still use it every day, although I have Acrobat Pro as part of an Adobe Creative Cloud subscription.
    Since quite a few people have found it useful and keep asking questions about the download location and destination of the filters, which have changed since 2007, I decided to write this update, and put it in this more current forum.
    Here is how to install it:
Download the filters here: https://dl.dropboxusercontent.com/u/41548940/PDF%20compression%20filters%20%28Unzip%20and%20put%20in%20your%20Library%20folder%29.zip
    Unzip the downloaded file and copy the filters in the appropriate location (see below).
    Here is the appropriate location for the filters:
    This assumes that your startup disk's name is "Macintosh HD". If it is different, just replace "Macintosh HD" with the name of your startup disk.
    If you are running Lion or Mountain Lion (OS X 10.7.x or 10.8.x) then you should put the downloaded filters in "Macintosh HD/Library/PDF Services". This folder should already exist and contain files. Once you put the downloaded filters there, you should have for example one file with the following path:
    "Macintosh HD/Library/PDF Services/Reduce to 150 dpi average quality - STANDARD COMPRESSION.qfilter"
If you are running an earlier version of OS X (10.6.x or earlier), then you should put the downloaded filters in "Macintosh HD/Library/Filters" and you should have for example one file with the following path:
    "Macintosh HD/Library/Filters/Reduce to 150 dpi average quality - STANDARD COMPRESSION.qfilter"
    Here is how to use it:
    Open a PDF file using Apple's Preview app,
Choose Export (or Save As if you have an older version of Mac OS X) in the File menu,
    Choose PDF as a format
    In the "Quartz Filter" drop-down menu, choose a filter "Reduce to xxx dpi yyy quality"; "Reduce to 150 dpi average quality - STANDARD COMPRESSION" is a good trade-off between quality and file size
    Here is how it works:
These are Quartz filters made with Apple's ColorSync Utility.
    They do two things:
    downsample images contained in a PDF to a target density such as 150 dpi,
    enable JPEG compression for those images with a low or medium setting.
    Which files does it work with?
    It works with most PDF files. However:
    It will generally work very well on unoptimized files such as scans made with the OS X scanning utility or PDFs produced via OS X printing dialog.
It will not further compress well-optimized (compressed) files and might create bigger files than the originals.
For some files it will create larger files than the originals. This can happen in particular when a PDF file contains optimizations other than image compression. There also seems to be a bug (reported to Apple) where in certain circumstances images in the target PDF are not JPEG compressed.
    What to do if it does not work for a file (target PDF is too big or even larger than the original PDF)?
First, some good news: since you used a Save As or Export command, the original PDF is untouched.
    You can try another filter for a smaller size at the expense of quality.
    The year being 2013, it is now quite easy to send large files through the internet using Dropbox, yousendit.com, wetransfer.com etc. and you can use these services to send your original PDF file.
    There are other ways of reducing the size of a PDF file, such as apps in the Mac App store, or online services such as the free and simple http://smallpdf.com
    What else?
    Feel free to use/distribute/package in any way you like.

    Thanks ioscar.
    The original link should be back online soon.
    I believe this is a Dropbox error about the traffic generated by my Dropbox shared links.
    I use Dropbox mainly for my business and I am pretty upset by this situation.
Since the filters themselves are about 5 KB, I doubt they are the cause of this Dropbox misbehavior!
    Anyway, I submitted a support ticket to Dropbox, and hope everything will be back to normal very soon.
    In the meantime, if you get the same error as ioscar when trying to download them, you can use the link in the blog posting he mentions.
    This is out of topic, but for those interested, here is my understanding of what happened with Dropbox.
    I did a few tests yesterday with large (up to 4GB) files and Dropbox shared links, trying to find the best way to send a 3 hour recording from French TV - French version of The Voice- to a friend's 5 year old son currently on vacation in Florida, and without access to French live or catch up TV services. One nice thing I found is that you can directly send the Dropbox download URL (the one from the Download button on the shared link page) to an AppleTV using AirFlick and it works well even for files with a large bitrate (except of course for the Dropbox maximum bandwidth per day limit!). Sadly, my Dropbox shared links were disabled before I could send anything to my friend.
    I may have used  a significant amount of bandwidth but nowhere near the 200GB/day limit of my Dropbox Pro account.
I see 2 possible reasons for Dropbox freaking out:
- My Dropbox Pro account is wrongly identified as a free account by Dropbox. Free Dropbox accounts have a 20GB/day limit, and it is possible that I reached this limit with my testing; I have a fast 200 Mb/s internet connection.
- Or Dropbox miscalculates used bandwidth, counting the total size of the file for every download begun; I started a lot of downloads, and skipped to the end of the video a lot of times on my Apple TV.

  • Index file increase with no corresponding increase in block numbers or Pag file size

    Hi All,
    Just wondering if anyone else has experienced this issue and/or can help explain why it is happening....
    I have a BSO cube fronted by a Hyperion Planning app, in version 11.1.2.1.000
The cube is in its infancy, but already contains 24M blocks, with a PAG file size of 12GB.  We expect this to grow fairly rapidly over the next 12 months or so.
    After performing a simple Agg of aggregating sparse dimensions, the Index file sits at 1.6GB.
    When I then perform a dense restructure, the index file reduces to 0.6GB.  The PAG file remains around 12GB (a minor reduction of 0.4GB occurs).  The number of blocks remains exactly the same.
    If I then run the Agg script again, the number of blocks again remains exactly the same, the PAG file increases by about 0.4GB, but the index file size leaps back to 1.6GB.
    If I then immediately re-run the Agg script, the # blocks still remains the same, the PAG file increases marginally (less than 0.1GB) and the Index remains exactly the same at 1.6GB.
    Subsequent passes of the Agg script have the same effect - a slight increase in the PAG file only.
    Performing another dense restructure reverts the Index file to 0.6GB (exactly the same number of bytes as before).
    I have tried running the Aggs using parallel calcs, and also as in series (ie single thread) and get exactly the same results.
    I figured there must be some kind of fragmentation happening on the Index, but can't think of a way to prove it.  At all stages of the above test, the Average Clustering Ratio remains at 1.00, but I believe this just relates to the data, rather than the Index.
    After a bit of research, it seems older versions of Essbase used to suffer from this Index 'leakage', but that it was fixed way before 11.1.2.1. 
    I also found the following thread which indicates that the Index tags may be duplicated during a calc to allow a read of the data during the calc;
    http://www.network54.com/Forum/58296/thread/1038502076/1038565646/index+file+size+grows+with+same+data+-
    However, even if all the Index tags are duplicated, I would expect the maximum growth of the Index file to be 100%, right?  But I am getting more than 160% growth (1.6GB / 0.6GB).
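A quick check of the growth figures quoted above (0.6 GB after the dense restructure, 1.6 GB after the Agg):

```python
# growth of the Index file relative to its post-restructure size
before_gb, after_gb = 0.6, 1.6
growth_pct = (after_gb - before_gb) / before_gb * 100
assert round(growth_pct) == 167  # well over the 100% that pure duplication allows
```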
    And what I haven't mentioned is that I am only aggregating a subset of the database, as my Agg script fixes on only certain members of my non-aggregating sparse dimensions (ie only 1 Scenario & Version)
    The Index file growth in itself is not a problem.  But the knock-on effect is that calc times increase - if I run back-to-back Aggs as above, the 2nd Agg calc takes 20% longer than the 1st.  And with the expected growth of the model, this will likely get much worse.
    Anyone have any explanation as to what is occurring, and how to prevent it...?
    Happy to add any other details that might help with troubleshooting, but thought I'd see if I get any bites first.
    The only other thing I think worth pointing out at this stage is that we have made the cube Direct I/O for performance reasons. I don't have much prior exposure to Direct I/O so don't know whether this could be contributing to the problem.
    Thanks for reading.

    alan.d wrote:
    The only other thing I think worth pointing out at this stage is that we have made the cube Direct I/O for performance reasons. I don't have much prior exposure to Direct I/O so don't know whether this could be contributing to the problem.
    Thanks for reading.
    I haven't tried Direct I/O for quite a while, but I never got it to work properly. Not exactly the same issue that you have, but it would spawn tons of .pag files in the past. You might try duplicating your cube, changing it to buffered I/O, and run the same processes and see if it does the same thing.
    Sabrina

  • The file size of selected file in input file control is shown as 0 for multiple file selection in Safari 5.1

The file size of a selected file in an input file control is shown as 0 for multiple file selection in Safari 5.1. If you select a single file, the file size is returned correctly. However, if you select multiple files, the file size of each selected file is always returned as 0 from JavaScript. This works correctly in Safari 4.0 but does not work in Safari 5.1.
    How do I get the correct file size in Safari 5.1 ?

If you want to post (or send me) a link to the lrcat file, I'd take a look at it for you and give you a break-down of what's consuming all the bytes. But it might be fun to learn how to do that yourself (e.g. using SQL). I use SQLiteSpy, but other people have their favorites (or you can use a command-line client if you prefer). One way: just run DROP TABLE "{table-name}" on each table, then look at the file size (do this to a copy, not the real thing).
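That drop-and-measure idea can be automated against throwaway copies with Python's built-in sqlite3; a sketch (generic SQLite, not Lightroom-specific; the helper name is made up):

```python
# estimate each table's share of a SQLite file: drop the table from a
# throwaway copy, VACUUM, and record how many bytes the file shrank by
import os
import shutil
import sqlite3
import tempfile

def table_size_contributions(db_path):
    con = sqlite3.connect(db_path)
    tables = [row[0] for row in con.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table' "
        "AND name NOT LIKE 'sqlite_%'")]
    con.close()
    original = os.path.getsize(db_path)
    contributions = {}
    for table in tables:
        scratch = tempfile.NamedTemporaryFile(suffix=".db", delete=False).name
        shutil.copyfile(db_path, scratch)
        con = sqlite3.connect(scratch)
        con.execute('DROP TABLE "%s"' % table)  # quoted: names may contain spaces
        con.commit()
        con.execute("VACUUM")  # reclaim the freed pages so the shrinkage shows
        con.close()
        contributions[table] = original - os.path.getsize(scratch)
        os.unlink(scratch)
    return contributions
```

Each value also absorbs whatever free-page slack VACUUM reclaims, so treat the numbers as estimates of relative weight rather than exact byte counts.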
    Anyway, it's hard to imagine keywords and captions etc. taking much of the space, since even if you had 1000 10-character words of text metadata per photo average that still only adds up to 117MB, which isn't a substantial portion of that 8G you're seeing occupied.
    Anyway, if you've painted the heck out of most of them and not cleared dev history, that'll do it - that's where I'd put my money too...
    One thing to consider to keep file-size down:
    ===================================
    * After reaching a milestone in your editing, take a snapshot then clear edit history, or the top part of it anyway (e.g. leave the import step), using a preset like:
    Clear Edit History.lrtemplate
s = {
    id = "E36E8CB3-B52B-41AC-8FA9-1989FAFD5223",
    internalName = "No Edit",
    title = "Clear Edit History",
    type = "Develop",
    value = {
        settings = {
            NoEdit = true,
        },
        uuid = "34402820-B470-4D5B-9369-0502F2176B7F",
    },
    version = 0,
}
    (that's my most frequently used preset, by far ;-})
    PS - I've written a plugin called DevHistoryEditor, which can auto-consolidate steps and reduce catalog size - it's a bit cumbersome to use a.t.m. but in case you're interested...
    Rob

  • Why is image file size getting inflated by Pages?

    I have a newsletter I am creating, and I have inserted 6 photos. They have been pre-downsized using Image Ready, so each image is around 30k. But after they have been inserted, the file size of the entire Pages document is huge. So I opened the Package for the document (Control click on the icon) and I see that now the images are each around 300k. The images have not been manipulated in Pages, no shadows or strokes. They have been resized just a teeny bit (made smaller in all cases).
So what is going on to cause this huge inflation of file size? It is making my document too big to email (although it has helped to set the PDF export to "Good"... or even better to use the ColorSync file size reduction dialog, which is buried so deep in the Print dialog that I can't believe this is a common way to do it...).
    Anyway, all help would be appreciated. Thanks, Peter

Yes, you are right that the best results I got were from the Print command and the ColorSync option to reduce file size. Without that, I took a Pages doc that was around 2.7 MB and the PDF varied from 1.8-2.5 MB using the export command. With the ColorSync method I did get it down to a 450 KB PDF, which I am satisfied with.
Still it baffles me that a 40 KB JPEG will inflate to 300 KB in Pages. I don't see multiple copies in the Package for the document, so I don't think it is multiplying. And I think that at least one of the images lives in the document without having been resized from within Pages, so that doesn't seem to be involved.
I am using JPEGs, adjusting them in Photoshop and then optimizing them with Image Ready, but they are JPEGs all through the process, and they clearly average around 35-40 KB after Image Ready does its thing. That is the whole purpose of Image Ready; it comes with Photoshop with that intention solely. I could try saving the image file in Preview or some other program and seeing if it inflates like the Image Ready JPEGs do.
Anyway, I am still curious about why this happens, but I can live with the rather bizarre method of using Print > ColorSync.
    Thanks to all of you for your help, really,
    Peter

  • Huge file size for .ai and .eps

Recently I upgraded to Illustrator CS6 and found something that is going to screw up my work with clients. The problem is that the .ai and .eps files have a really huge size (67 MB on average). I have used about 6 images (JPEGs at 1024x768, less than 1 MB each), made them into patterns in the Swatches panel and applied them to various shapes. The .ai file size is 109 MB :O :O :O . Another ridiculous thing is that an art file that contains only 3 lines of text plus some circles and squares (3 circles, 2 squares) filled with solid color is around 68 MB.
    This is really bad when I have to send the files to the clients. Please tell me how to reduce the file size without compromising on the pattern quality. Please, this is quite urgent.

    Dang! This is really weird. The files I was referring to were for iPad (artboard size 2048x1536). I opened a new doc @ 800x600 and copy pasted everything in the other file and saved it - 109MB file reduced to 402KB jaw dropper!!! I don't know the logic behind it, but I guess it has to be something related to artboard size ...
    Thanks guys.

  • Incredibly strange file size discrepancy only appears in image files (jpg, gif, png)!

    I'm creating a bunch of banners for google ads, yahoo ads, ...etc in Photoshop on my Mac OS!
    The .gif files of these banners appear to TRIPLE in size when on the MAC (>150KB), but when transferred to windows; the real file size shows correctly! (<50KB)
    It is not a result of the base2 vs base10 discrepancy since the difference in size is simply too big, and it only happens with files created on Photoshop on my Mac.
    The reason I know that windows is showing the correct size while my Mac OS is displaying the wrong size, is that the file gets approved by google and yahoo ads, even though Mac OS shows that it surpasses the size limit (50KB) three times over!
    This isn't an isolated incident either, all image files created in Photoshop on the MAC continue this weird behaviour! However, files downloaded from the net appear to be consistent on both operating systems!
    One example is the attached screenshot:
    Explanation, please??

    Geez, sorry I offended you Mr. Jobs (incarnate)!
    You came in here with a three ton chip on your shoulders. Did you really expect sunshine and puppies in return?
    No, I expected useful help, and I got it from Jeffrey Jones. Thanks again Jeffrey!
    I mean, when you move or upload it, it loses this data association anyways!
    To a drive which doesn't support Apple's AB tree structure (NTFS, FAT, FAT32, exFAT), yes. To another HFS+ drive, no.
    What about uploading the file to the cloud?? Does it lose this association or not?? And does anyone really care about the data in the Resource fork?
This "Resource Fork" means nothing to the file owner, only to the OS and the drive. Therefore, it shouldn't be added to the total. Period. Because it's not part of the file; it's part of Apple's tree structure! This is really a simple concept; not sure why you are bending over backwards to defend a clearly stupid oversight from Apple!
    There's no reason to force me to use the command line to get the real file size of a GIF! There's just no excuse for that!
    If an OS is saying it is fetching file size information for a single file, it should do exactly that! Not add hidden Resource Forks that are part of the OS's internal workings
OS X is fetching the file size. Its file size, not the way a different OS would report it.
There is no such thing as "its" file size. A file size is a file size, across all platforms, on the cloud, wherever!
A GIF file should have the same file size whether it's on Windows, Linux, Unix, Darwin, FreeBSD, or anything else. The only time its weight should vary is in outer space!
    That is why I'm surprised that they are breaking simple UI Design rules.
    The User Interface has nothing to do with the file structure of a drive.
    I don't care about the structure of the drive!! Neither should you, neither should the average user!
    A good UI should NOT concern the user with this! The average user doesn't care about these Resource Forks, and will never try to view them, therefore, there is no logical reason to add up their file sizes to the total size of each file, and then to make things worse, hide that fact! That only creates confusion, and it makes it so much harder for a designer like myself to view the REAL file sizes of my image files! Now, whenever I'm on my MAC, I will have to run command line scripts to be able to see if my GIF files (that I work on EVERYDAY) meet the file size quota, because Mac OS adds up hidden files that I have no use for and gives me the WRONG file size!
Let's say this again: when you select a file and click Get Info, you should get the info for the file you selected. Nothing more, nothing less! I don't care if the file structure creates an entire colony of hidden files; they should be completely hidden to me, and if not, the Get Info dialog box should at least give me two sizes: the REAL file size, and the added-up file size including the Resource Fork (although I can't think of any good reason why it should add up the Resource Fork size anyway)!
    do you think it is at all logical, that when you select 300 or so files, and click Get Info, that it open 300 windows at once each showing separate information for each file? Or does it make a lot more sense, intuitively, to get the total tally of all the files selected added up, without having to hold down shortcut keys when clicking them to do so?
    Yes, it is logical because that's what you, the user, told the OS to do. You wanted the Get Info data on 300 individual items. I don't know about you, but I avoid the menu bar as much as possible (your reference to avoiding shortcut keys). Command+I will always give you singular Get Info dialogue boxes.
    No, that's not what I told the OS to do. I selected 300 files cumulatively, therefore, I should get the cumulative info for all the selected files. That's just common sense. Every other OS seems to get this!
    And I'm hard pressed to find anyone who has found a use for having 300 get info boxes open at the same time. Therefore, that shouldn't be the default.
Will you start defending Apple's decision to stick with the one-button mouse for all those years, depriving us of the all-important context menu, as well?? There was absolutely no good reason to do that, just as there is absolutely no good reason to do this!
