Loading large files

Newbie as I am, I need to ask this expert forum.
I'm having trouble loading tracks I've recorded with my little portable digital recorder (Yamaha C24). It seems that once I get over a certain file size (> 200MB or so), Soundbooth doesn't load the track. I can play it in another player, but Soundbooth's GUI stays blank when I try to load it.
Does anybody have a clue about what I'm seeing here?

Thank you for the response!
I believe your comment may be misguided. My website downscales a full-resolution GIF (1080x1000) down to 270x250, and that downscaled image looks fantastic and full of detail at the reduced resolution. If your assessment were correct, the downscaled image would look blurry because "there are simply not enough pixels to store the information". That is not the case, because it looks great!
I want to achieve the same result by resizing that full-resolution image manually (not automatically by the website), as in the sketch below. How can I do this?
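For anyone wanting to reproduce that kind of downscale outside the browser, here is a minimal Java sketch using only the standard library; the 270x250 target is from the post, while the file names are hypothetical:

import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

public class Downscale {
    public static void main(String[] args) throws Exception {
        BufferedImage src = ImageIO.read(new File("logo.gif"));      // hypothetical input
        BufferedImage dst = new BufferedImage(270, 250, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = dst.createGraphics();
        // Bilinear filtering averages neighbouring pixels, which is what
        // keeps a downscaled image looking smooth rather than blocky.
        g.setRenderingHint(RenderingHints.KEY_INTERPOLATION,
                           RenderingHints.VALUE_INTERPOLATION_BILINEAR);
        g.drawImage(src, 0, 0, 270, 250, null);
        g.dispose();
        ImageIO.write(dst, "png", new File("logo-small.png"));       // hypothetical output
    }
}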

Similar Messages

  • Loading large files in Java Swing GUI

    Hello Everyone!
    I am trying to load large files (more than 70 MB of XML text) into a Java Swing GUI. I tried several approaches:
    1) Byte-based loading, with a loop similar to:
    pane.setText("");
    InputStream file_reader = new BufferedInputStream(new FileInputStream(file));
    int BUFFER_SIZE = 4096;
    byte[] buffer = new byte[BUFFER_SIZE];
    int bytesRead;
    String line;
    // append the file to the text pane one buffer at a time
    while ((bytesRead = file_reader.read(buffer, 0, BUFFER_SIZE)) != -1) {
        line = new String(buffer, 0, bytesRead);
        pane.append(line);
    }
    file_reader.close();
    But this gives me unacceptable response times for large files and runs out of Java heap memory.
    2) I read in several places that I could load only small chunks of the file at a time, loading the next/previous chunk as the user scrolls down or up. To achieve this, I am guessing extensive manipulation of the scrollbar in the JScrollPane will be needed, or perhaps adding an external JScrollBar? Can anyone provide sample code for that approach? (Keep in mind that I am writing code for an editor, so I will need to interact via clicks, mouse wheel rotation, keyboard buttons, and so on...)
    If anyone can help me, post sample code, or point me to useful links that deal with this issue or with writing code for editors in general, I would be very grateful.
    Thank you in advance.

    Hi,
    I'm replying to your question from another thread.
    To handle large files I used the new IO library. I'm trying to remember off the top of my head, but the classes involved were RandomAccessFile, FileChannel, and MappedByteBuffer. The MappedByteBuffer was the best way for me to read and write to the file.
    When opening the file I had to scan through its contents using a Swing worker thread and a progress monitor. Whilst doing this I indexed the file into manageable chunks. I also created a cache to further optimise file access.
    In all it worked really well, and I was surprised by the performance of the new IO libraries. I remember loading 1GB files, and whilst you had to wait a few seconds for the indexing, you wouldn't know that the data for the JList was being retrieved from a file whilst the application was running.
    Good Luck,
    Martin.
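    For readers who want a starting point, here is a minimal sketch of the mapping step Martin describes, using only java.nio; the chunk indexing, caching, and Swing worker wiring he mentions are left out, and the 4 KB window is an arbitrary assumption:
    import java.io.RandomAccessFile;
    import java.nio.MappedByteBuffer;
    import java.nio.channels.FileChannel;

    public class MappedRead {
        public static void main(String[] args) throws Exception {
            try (RandomAccessFile raf = new RandomAccessFile(args[0], "r");
                 FileChannel channel = raf.getChannel()) {
                // Map a window of the file into memory instead of reading it all;
                // a real editor would map a window around the visible region.
                long windowSize = Math.min(4096, channel.size());
                MappedByteBuffer buf =
                    channel.map(FileChannel.MapMode.READ_ONLY, 0, windowSize);
                byte[] chunk = new byte[(int) windowSize];
                buf.get(chunk);
                System.out.print(new String(chunk, "UTF-8"));
            }
        }
    }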

  • Creating a .pdf from a .ppt or .doc in Acrobat 9 resulting in strangely large file sizes

    I am printing to .pdf from Microsoft Word and PowerPoint (previously Office 2007 - now Office 2010) and the file sizes sometimes double. Also, for some larger documents, multiple .pdfs are generated, as if Acrobat is choosing where to cut off a document due to large file size and generating secondary, tertiary, etc., docs to continue the .pdf. The documents have placed images (Excel charts, etc.), but they are not enormous - that is, until they are generated into a .pdf. I had hoped that when I upgraded to Office 2010 this issue would go away, but unfortunately it hasn't. Any ideas? Thoughts? Thanks so much!

    One thing that can happen with MS Office applications when an image contains transparency is that the "image" is actually composed of multiple one-pixel images when converted to PDF. This can be confirmed by examining the file: try selecting the images with the TouchUp Object tool to see whether what looks like a single image is selected in its entirety.

  • Video Stalls - Larger File Size

    Just downloaded Battlestar Galactica season 2, and the video stalls at full screen. I do not have this problem with previous Battlestar Galactica shows and movies that I have downloaded in the past.
    I notice the file size and total bit rate of the new shows are three times greater than those of the shows I've downloaded in the past.
    Using the Activity Monitor utility, I notice I'm maxing out my CPU when I go to full screen, which I think is causing my video to stall.
    I also notice that my previous TV show downloads now don't fully fill my screen like they did before the new downloads.
    I hope someone might know how I can get my TV shows back to working normally at full screen, and why the new TV shows' file sizes are larger than before.
    Thanks for Your Help
    Ricky
    PowerBook G4 15" LCD 1.67GHz   Mac OS X (10.3.9)  

    Thanks for the reply. Shortly after I posted, I started to realize that Apple was offering better TV show quality, which results in larger file sizes. I also went to the Energy Saver section in System Preferences and found that Optimize Energy Settings was set to Presentations. I changed it to Highest Performance, which resolved the issue.
    iTunes 7.0.2
    Quicktime 7.1.3
    RAM 512MB
    Thanks for your help.

  • Aperture is exporting large file sizes, e.g. the original image is 18.2MB and the exported version (TIFF 16-bit) is 47.9MB, any ideas please


    Raws, even if not compressed, should be smaller than a 24-bit TIFF, since they have only one bitplane. My T3i shoots 14-bit 18MP raws and has a raw file size of 24.5 MB*. An uncompressed 24-bit TIFF should have a size of 18 MP x 3 bytes per pixel, or 54 MB.
    *There must be some lossless compression going on, since 18 MP times 1.75 bytes per pixel is 31.5 MB for uncompressed raw.
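    The reply's arithmetic as a quick runnable check (a sketch; the 16-bit TIFF line extrapolates the same bytes-per-pixel reasoning to the poster's export format and is not a figure from the thread):
    public class ImageSizes {
        public static void main(String[] args) {
            long pixels = 18_000_000L;              // 18 MP sensor
            double tiff24 = pixels * 3 / 1e6;       // 8 bits x 3 channels = 3 bytes/px
            double tiff48 = pixels * 6 / 1e6;       // 16-bit TIFF = 6 bytes/px
            double raw14 = pixels * 1.75 / 1e6;     // 14 bits/px = 1.75 bytes/px, one bitplane
            System.out.printf("24-bit TIFF: %.1f MB%n", tiff24);            // 54.0 MB
            System.out.printf("16-bit TIFF: %.1f MB%n", tiff48);            // 108.0 MB
            System.out.printf("uncompressed 14-bit raw: %.1f MB%n", raw14); // 31.5 MB
        }
    }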

  • Facebook application grows to a large file size

    Why does the Facebook application grow to a larger and larger file size? What can I do? The longer I use this program, the bigger its file size gets. I use Facebook version 4.0.3 on an iPhone 4 running iOS 5.0.1.
    thank you.
    Eak.

    Most of the log rotation for the different components can be set in the EM console, but if you just want to do it with a cron job or a script, here is what I have used in the past. Edit it to your needs.
    #!/bin/sh
    ORACLE_HOME=/home/oracle10FRS
    apache_logs=$ORACLE_HOME/Apache/Apache/logs
    webcache_logs=$ORACLE_HOME/webcache/logs
    sysman_logs=$ORACLE_HOME/sysman/log
    j2ee_logs=$ORACLE_HOME/j2ee/OC4J_BI_Forms/log/OC4J_BI_Forms_default_island_1
    also_j2ee=$ORACLE_HOME/j2ee/home/log/home_default_island_1
    max_size=1500000000
    # Walk every log location and truncate any file that has grown past max_size.
    for file in $apache_logs/*_log $webcache_logs/*_log $sysman_logs/*log \
                $j2ee_logs/*.log $also_j2ee/*.log
    do
         size=`ls -l $file | awk '{print $5}'`
         echo "File $file is $size bytes"
         if [ "$size" -ge "$max_size" ]
         then
              echo "That's too big. Truncating"
              >$file
         fi
    done

  • Need to get large file sizes down to downloadable sizes - how?

    I am using Compressor 4 and have several professional videos I produced with Final Cut Studio in QuickTime/.MOV format using ProRes 422. They are all 720p and, of course, all have large file sizes. I need to convert them so they are compatible with iTunes as well as Apple devices (Apple TV, iPad/iPod/iPhone, etc.). They also need to be downloadable in a reasonable time, and of course I want to maintain a quality image. Several of the videos are about an hour long, some a little more.
    I am unable to get most of these files small enough for practical download while still maintaining acceptable quality. For example:
    I have one video (QuickTime rendered from FCPS) that runs 54 minutes: 1280x720, ProRes 422, Linear PCM, with a bit rate of 72.662. The best I seem to be able to get out of Compressor is to use the setting options for "HD for Apple Devices (5 Mbps)", which gives me a file that's just over 2GB. Not bad, and the quality is still very good, but the file size is just too big for people to download. I get reports of people seeing estimates of 77 hours to download this one file.
    I did try using the setting under Video Sharing Services / Large 540p Video Sharing just to see what it would give me, but it really wasn't that much smaller and the quality was noticeably degraded.
    I'm confused as to what to do about this. I have downloaded things from iTunes before that are good quality, about an hour long, and their file sizes are simply not this big.
    Can anyone give me some advice? Thank you. Grateful for anything you can offer.
    MC

    If you're not concerned with a precise average bitrate, use the H.264 codec, set the bitrate to "automatic" and use the quality slider. Be sure to have "keyframes" on "automatic"; every 24 frames or so makes H.264 quite inefficient and is only useful for live streaming, where you want people to be able to resynchronise as quickly as possible. Having multi-pass enabled might help, but I'm not sure what it does when using constant-quality encoding.
    Do a test encoding of a representative clip of maybe two minutes or so, play with the quality slider, and see how far down you can go and still be satisfied with the quality. Using "automatic" bitrate uses as many bits to encode each frame as is necessary to get the required quality, so file sizes will vary depending on the type of material. Complex textures and high motion will use more bits; still frames and slow pans will use very little.
    A light chroma-channel denoising may help improve perceived quality, as may very light sharpening in case you scaled down the material.
    If you're still not happy with the bitrate vs. quality tradeoff, try reducing the resolution, but generally you should be able to get 720p down to around 2.5 to 3.0 Mbps with adequate quality.
    On the other hand, if someone needs 77 hours to download 2 GB, they have another problem anyway. That sounds like a 56k modem connection. What are they doing trying to download HD content anyway?
    Bernd
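    As a sanity check on those numbers, average bitrate times duration gives the file size directly. A quick runnable sketch (the 54-minute runtime is from the post above; everything else follows from the 3.0 Mbps suggestion):
    public class BitrateSize {
        public static void main(String[] args) {
            double mbps = 3.0;   // suggested average bitrate for 720p
            int minutes = 54;    // runtime of the video in the post
            // megabits per second -> megabytes: divide by 8
            double megabytes = mbps * minutes * 60 / 8;
            System.out.printf("%d min at %.1f Mbps ~= %.0f MB%n", minutes, mbps, megabytes);
        }
    }
    At 3.0 Mbps the 54-minute video comes out around 1.2 GB, roughly half the size of the 5 Mbps encode.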

  • Large file sizes when saving JPEG and PNG files

    I've recently purchased Web Premium 5.5, and I'm trying to save files in JPEG and PNG format in Photoshop and getting unexpectedly large file sizes. For example, I save a simple 246x48 logo image consisting of only 3 basic colours; when I save it as a JPEG at quality 11, the Finder reports the file size as 61.6k, and when I save it as a PNG it's 60.2k. I also cropped the image to 190x48 and saved it as a PNG, and the file size is actually larger at 61.9k; saving as a JPEG at quality 7, the file size is a still relatively large 54k.
    I have a similar non-indexed-colour PNG logo on my Mac that I saved in CS3 on a PC; it is actually larger at 260x148 yet only 17k, and that logo is actually more complex, with a gloss effect over part of it. What's going on and how do I fix it? It's making Photoshop useless, especially when saving for the web.
    Thanks

    Thanks, I had considered that, but all my old files report the correct file sizes. I have been experimenting, and Fireworks saves the file as PNG-24 at 2.6k and as a JPEG at 5.1k. But I don't really want to have to save the files twice, once cut from the comp in Photoshop and again in Fireworks; juggling between the two applications is a bit inconvenient even with just one file, and especially so when you have potentially hundreds of files to save.
    I've also turned off the icon and Windows thumbnail options in Photoshop's preferences, and although this has decreased the file size, the files are still quite large at 27k; Save for Web is better, at 4k for the PNG and 16k for the JPEG. Is there any way to get Fireworks' file-saving performance in Photoshop? It seems strange that the compression in Photoshop would be second-rate in comparison to Fireworks, given they are both developed by Adobe and Photoshop is Adobe's primary image-editing software.

  • Does FW CS5 Mac crap out w large file sizes?

    Does FW CS5 Mac crap out with large file sizes like CS4 did? It seems like files over 4-5MB tend to be flaky... Is there an optimum file size? I'm running a Mac Tower 2.93 GHz Quad Core, with 8 GB RAM.

    Why not download the trial version and find out?

  • (urgent) SQL*Loader Large file support in O734

    Hi there,
    I get the following SQL*Loader error when trying to upload data files,
    each 10G - 20G in size, to an Oracle 7.3.4 DB on SunOS 5.6:
    >>
    SQL*Loader-500: Unable to open file (..... /tstt.dat)
    SVR4 Error: 79: Value too large for defined data type
    <<
    I know there's a bug fix for large file support in Oracle 8:
    >>
    Oracle supports files over 2GB for the oracle executable.
    Contact Worldwide Support for information about fixes for bug 508304,
    which will add large file support for imp, exp, and sqlldr
    <<
    However, I really want to know: is there any fix for Oracle 7.3.4?
    Thanks.

    Example
    Control file
    C:\DOCUME~1\MAMOHI~1>type dept.ctl
    load data
    infile dept.dat
    into table dept
    append
    fields terminated by ',' optionally enclosed by '"'
    trailing nullcols
    (deptno integer external,
    dname char,
    loc char)
    Data file
    C:\DOCUME~1\MAMOHI~1>type dept.dat
    50,IT,VIKARABAD
    60,INVENTORY,NIZAMABAD
    C:\DOCUME~1\MAMOHI~1>
    C:\DOCUME~1\MAMOHI~1>dir dept.*
    Volume in drive C has no label.
    Volume Serial Number is 9CCC-A1AF
    Directory of C:\DOCUME~1\MAMOHI~1
    09/21/2006  08:33 AM               177 dept.ctl
    04/05/2007  12:17 PM                41 dept.dat
                   2 File(s)          8,043 bytes
                   0 Dir(s)   1,165 bytes free
    Intelligent sqlldr command
    C:\DOCUME~1\MAMOHI~1>sqlldr userid=hary/hary control=dept.ctl
    SQL*Loader: Release 10.2.0.1.0 - Production on Thu Apr 5 12:18:26 2007
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Commit point reached - logical record count 2
    C:\DOCUME~1\MAMOHI~1>sqlplus hary/hary
    SQL*Plus: Release 10.2.0.1.0 - Production on Thu Apr 5 12:18:37 2007
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Connected to:
    Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    As I am appending I got two extra rows. One department in your district and another in my district :)
    SQL> select * from dept;
        DEPTNO DNAME          LOC
            10 ACCOUNTING     NEW YORK
            20 RESEARCH       DALLAS
            30 SALES          CHICAGO
            40 OPERATIONS     BOSTON
            50 IT             VIKARABAD
            60 INVENTORY      NIZAMABAD
    6 rows selected.
    SQL>

  • Large file size and fuzzy type

    I'm new at using FCE2 and composed my first short 4-minute video. It includes a still image, a PSD layered file (which I discovered was quite handy), a few clips, and type using the type generator. I exported it to a QuickTime movie. Three observations: I was floored by the nearly 1GB size, the fuzzy type, and the fact that the file on the hard drive says it's an FCE movie, complete with the little FCE icon by the file name. I was expecting to see a QuickTime icon by the file name, with the type of file being a QuickTime movie. Is all this normal? I'm very disappointed in the fuzzy type. Oh, also, the still image became blurry. Why?? Just so you know, the still image was a special file, 640x480 with a pixel aspect ratio of D4/D16 Anamorphic.
    I saved the project under a new name and redid it, taking out the Photoshop image (removing it from the bin also), and the new movie exported even larger, over a gig in size. Huh??

    Thanks for the reply Tom. After I posted the first time, I went to the Finder, Get Info, and saw that I could change 'Opens with' to QuickTime, and it therefore became a QuickTime file. About that Anamorphic business: I read a 'how to' on dealing with images before bringing them into video. The tip said, in the 'New' file dialogue box, to choose the 640x480 size, and in the pull-down menu at the bottom where you choose the 'Pixel Aspect Ratio' it suggested using that Anamorphic setting. I did, but it certainly didn't look right; I went with it anyway.
    Again, after I posted, I looked at the format of one of the clips and saw the size to be 720x480, the compressor DV/DVCPRO-NTSC, the pixel aspect NTSC-CCIR 601, and the Anamorphic field blank. I'm running Photoshop CS2. So I went back there and created a new blank file to use as a template for dealing with stills, but this time I used the Preset pull-down menu and chose NTSC DV 720x480 with guides, and the Pixel Aspect Ratio automatically loaded as D1/DV NTSC (0.9). I clicked OK and voilà, the blank file looks exactly like the Canvas in FCE. I haven't tried a still with this new setting, but I will try it on the little project I'm working on.
    As for viewing it, I am looking at it on my Mac flat screen. I went into QuickTime Preferences and checked the box for high quality, thank you. Thanks for reassuring me about the file size.
    I also don't know what "D4/D16 Anamorphic" means.
    I don't understand the fuzzy type. I'm aware these are 72 ppi files, and that video is not resolution dependent but rather pixel dependent. Computer monitors display at 72 ppi; televisions are higher. I have yet to complete the process of burning a DVD and playing it back on a TV. Maybe that's where I'll see the type showing up sharper.
    At any rate, just dealing with this itty bitty project tells me I have a lot to learn about video, never mind learning about how to use FCE as well.

  • 4.2.3/.4 Data load wizard - slow when loading large files

    Hi,
    I am using the data load wizard to load CSV files into an existing table. It works fine with small files of up to a few thousand rows, but when loading 20k rows or more the loading process becomes very slow. The table has a single numeric column as primary key.
    The primary key is declared at "Shared Components" -> Logic -> "Data Load Tables" and is recognized as "pk(number)" with "case sensitive" set to "No".
    While loading data, this configuration leads to the execution of the following query for each row:
    select 1 from "KLAUS"."PD_IF_CSV_ROW" where upper("PK") = upper(:uk_1)
    which can be found in the v$sql view while loading.
    This makes the loading process slow: because of the UPPER function, no index can be used.
    It seems that the "case sensitive" setting is not evaluated.
    Dropping the numeric index for the primary key and using a function-based index does not help.
    The explain plan shows an implicit TO_CHAR conversion:
    UPPER(TO_CHAR("PK")) = UPPER(:UK_1)
    The TO_CHAR is missing in the query itself, but maybe it is necessary for the function-based index to work.
    Please provide a solution or workaround for the data load wizard to work with large files in an acceptable amount of time.
    Best regards
    Klaus

    Nevertheless, a bulk loading process is what I would really like to have as part of the wizard.
    If all of the CSV files are identical:
    use the Excel2Collection plugin ( - Process Type Plugin - EXCEL2COLLECTIONS )
    create a VIEW on the collection (makes it easier elsewhere)
    create a procedure (in a Package) to bulk process it.
    The most important thing is to have, somewhere in the Package (i.e. your code that is not part of APEX), information that clearly states which columns in the Collection map to which columns in the table, the view, and the variables (APEX_APPLICATION.g_fxx()) used for Tabular Forms.
    MK
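    Outside of APEX, the bulk pattern MK describes comes down to batched inserts rather than a per-row lookup. A minimal JDBC sketch of that idea (the PD_IF_CSV_ROW table name comes from the question; the VAL column, file name, connection URL, and credentials are hypothetical):
    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class CsvBulkLoad {
        public static void main(String[] args) throws Exception {
            try (Connection con = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//localhost:1521/XE", "klaus", "secret");
                 BufferedReader in = new BufferedReader(new FileReader("rows.csv"));
                 PreparedStatement ps = con.prepareStatement(
                     "insert into PD_IF_CSV_ROW (PK, VAL) values (?, ?)")) {
                con.setAutoCommit(false);
                String line;
                int n = 0;
                while ((line = in.readLine()) != null) {
                    String[] f = line.split(",");
                    ps.setLong(1, Long.parseLong(f[0]));
                    ps.setString(2, f[1]);
                    ps.addBatch();
                    if (++n % 1000 == 0) ps.executeBatch(); // flush every 1000 rows
                }
                ps.executeBatch(); // flush the remainder
                con.commit();
            }
        }
    }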

  • Flash media server taking forever to load large files

    We purchased FMIS, and we are encoding large 15+ hour MP4 recordings using Flash Media Encoder. When opening these large files for playback, if they have not been opened recently the player displays the loading indicator for up to 4 minutes! Once a file has apparently been cached on the server, it opens immediately from any browser, even after clearing the local browser cache. So, a few questions for the experts:
    1. Why is it taking so long to load the file? Is it because the MP4 metadata is in the wrong format and the file is so huge? I read somewhere that Media Encoder records with incorrect MP4 metadata; is that still the case?
    2. Once it's cached on the server, exactly how much of it is cached? Some of these files are larger than 500MB.
    3. What FMS settings do you suggest I change? FMIS is running on Windows Server R2 64-bit, but FMIS itself is 32-bit; we have not upgraded to the 64-bit version. We have 8GB of RAM. Is it OK to set the FMS cache to 3GB? And would that only have enough room for 3-4 large files? Because we have hundreds of them.
    best,
    Tuviah
    Lead programmer, solid state logic inc

    Hi Tuviah,
    You may want to email me offline with more questions, as this can get a little specific, but I'll hit the general problems here.
    MP4 is a fine format, and I won't speak ill of it, but it does have weaknesses. In FMS's implementation those weaknesses tend to manifest around the combination of recording and very large files, so some of these things are a known issue.
    The problem is that MP4 recording is achieved through what's called MP4 fragmentation. It's a part of the MP4 spec that not every vendor supports, but it has a very particular purpose, namely the ability to continually grow an MP4-style file efficiently. Without fragments, one has the problem that a large file must be constantly rewritten as a whole to update the MOOV box (the file's index) - fragments allow simple appending. In other words, it's tricky to make MP4 recording scalable (as for a server) and still have the basic MP4 format - hence fragments.
    There's a tradeoff to this, however, in that the index of the file is broken up over the whole file. Also, these large files are likely tucked away on a NAS for you, or something similar - normal, as you likely can't store all of them locally. But that is the bad combination of needing to index the file (touching parts of the whole thing) and doing network reads to do it. This is likely the cause of the long delay you're facing - here are some things you can do to help.
    1. Post-process the F4V/MP4 files into a non-fragmented format - this may improve load time significantly; it could still be considered slow, but it should be faster. Cheap to try out on a few files. (F4V and MP4 are the same thing for this purpose, so don't worry about the tool naming.)
    http://www.adobe.com/products/flashmediaserver/tool_downloads/
    2. Alternatively, this is why we created the raw: format. For long recordings MP4 is just not ideal, and the raw format solves many of the problems involved in this kind of recording. Check it out:
    http://help.adobe.com/en_US/flashmediaserver/devguide/WSecdb3a64785bec8751534fae12a16ad0277-8000.html
    3. You may also want to check out FMS HTTP Dynamic Streaming - it also solves this problem, along with others like content protection and DVR, and it's our most recent offering, so it has a lot of strengths the other approaches don't.
    http://www.adobe.com/products/httpdynamicstreaming/
    Hope that helps,
    Asa
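    To see the fragmentation Asa describes for yourself, you can walk the top-level boxes of a file: a fragmented recording contains many 'moof' boxes instead of a single up-front 'moov' index. Here is a rough sketch using only the Java standard library (box layout per the ISO base media file format spec; an illustration, not an Adobe tool):
    import java.io.IOException;
    import java.io.RandomAccessFile;

    public class Mp4BoxLister {
        public static void main(String[] args) throws IOException {
            try (RandomAccessFile f = new RandomAccessFile(args[0], "r")) {
                long pos = 0, len = f.length();
                while (pos + 8 <= len) {
                    f.seek(pos);
                    long size = f.readInt() & 0xFFFFFFFFL; // 32-bit box size
                    byte[] type = new byte[4];
                    f.readFully(type);                     // 4-character box type
                    if (size == 1) size = f.readLong();    // 64-bit extended size
                    else if (size == 0) size = len - pos;  // box runs to end of file
                    System.out.printf("%s at %,d (%,d bytes)%n",
                                      new String(type, "US-ASCII"), pos, size);
                    if (size < 8) break;                   // corrupt box; stop
                    pos += size;
                }
            }
        }
    }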

  • Is it best to upload HD movies from a camcorder to iMovie or iPhoto? iMovie gives the option for very large file sizes - presumably it is best to use this file size because the HD movie is then at its best quality?

    Is it best to upload an HD movie to iPhoto or iMovie? iMovie seems to store the movie in much larger files - presumably this is preferable, as a larger file means better quality?

    Generally it is, if you're comparing identical compressors & resolutions, but there's something else happening here. If you're worried about quality degrading, check the original file details on the camera (or card), either with Get Info or by opening it in QuickTime and showing info. You should find the iPhoto version (reveal in Finder) is a straight copy. You can't really increase the image quality of a movie (barring a few tricks) by increasing file size, but Apple editing products create a more "scrub-able" intermediate file, which is quite large.
    Good luck and happy editing.

  • Getting large file sizes in AppleScript...

    For starters I am new to AppleScript. Please excuse my lack of knowledge.
    I am trying to get file sizes for large files and kicking that out to a text file. Problem is that all these files are a gigabyte and up. When I use:
    set fileSIZE to size of (info for chosenFile) as string
    --result is 1.709683486E+9
    I also tried using:
    tell application "Finder"
    set theSize to physical size of chosenFile
    end tell
    --result is 1.724383232E+9
    So, my question is: is there another way that shows me the bytes in this format: "1,709,683,486" bytes?
    The "size of (info for chosenFile)" shows the right digits, "1.709683486"; I don't really need the commas. I had a larger file of 44GB and the result was 4.4380597717E+10.
    How do I remove the ".", the "E+9" or "E+10"?

    Looks like hubionmac has found a winner.
    Perhaps not. Page 89 of the AppleScript Language Guide: the largest integer value is 536,870,911; larger integers are converted to real numbers.
    Notice that the different requests for size report different values.
    Notice that ls -l returns the data fork size in Tiger.
    Notice the clever way of working with the resource fork in Unix ( /rsrc ). I found this in juliejuliejulie's code in another post.
    set theFile to (choose file)
    set fileSIZE to size of (info for theFile) as miles as string
    log "fileSIZE = " & fileSIZE
    tell application "Finder"
       set pSize to physical size of theFile
    end tell
    log "pSize = " & pSize
    set stringSize to pSize as miles as string
    log "stringSize = " & stringSize
    set theItem to quoted form of POSIX path of (theFile)
    log "theItem = " & theItem
    -- unix ls -l command will give size. 
    -- Looks like ls -l gives the data fork size.
    set theDataSize to (do shell script "ls -l " & theItem & " | awk '{print $5}'")
    log "theDataSize = " & theDataSize
    set theRsrc to (POSIX path of theFile) & "/rsrc"
    log "theRsrc = " & theRsrc
    set theRsrc to quoted form of theRsrc
    log "theRsrc = " & theRsrc
    set theRsrcSize to (do shell script "ls -l " & theRsrc & " | awk '{print $5}'")
    log "theRsrcSize = " & theRsrcSize
    set output to do shell script "echo \"" & theDataSize & "+" & theRsrcSize & "\" | bc "
    log "combined data and resource size = " & output
    Here is what I get when I run the above script.
    tell current application
       choose file
          alias "Macintosh-HD:System Folder:Finder"
       info for alias "Macintosh-HD:System Folder:Finder"
          {name:"Finder", creation date:date "Tuesday, May 29, 2001 3:00:00 PM", modification date:date "Tuesday, May 29, 2001 3:00:00 PM", icon position:{1, 128}, size:2.439365E+6, folder:false, alias:false, package folder:false, visible:true, extension hidden:false, name extension:missing value, displayed name:"Finder", default application:alias "Macintosh-HD:Applications (Mac OS 9):Utilities:Assistants:Setup Assistant:Setup Assistant", kind:"Finder", file type:"FNDR", file creator:"MACS", type identifier:"dyn.agk8yqxwenk", locked:false, busy status:false, short version:"9.2", long version:"9.2, Copyright Apple Computer, Inc. 1983-2001"}
       (*fileSIZE = 2439365*)
    end tell
    tell application "Finder"
       get physical size of alias "Macintosh-HD:System Folder:Finder"
          2.445312E+6
       (*pSize = 2.445312E+6*)
       (*stringSize = 2445312*)
       (*theItem = '/System Folder/Finder'*)
    end tell
    tell current application
       do shell script "ls -l '/System Folder/Finder' | awk '{print $5}'"
          "1914636"
       (*theDataSize = 1914636*)
       (*theRsrc = /System Folder/Finder/rsrc*)
       (*theRsrc = '/System Folder/Finder/rsrc'*)
       do shell script "ls -l '/System Folder/Finder/rsrc' | awk '{print $5}'"
          "524729"
       (*theRsrcSize = 524729*)
       do shell script "echo \"1914636+524729\" | bc "
          "2439365"
       (*combined data and resource size = 2439365*)
    end tell
    Message was edited by: rccharles
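    For comparison, the comma-grouped output the original poster asked for is a one-line format call in Java (a sketch; the 1,709,683,486 figure is the byte count from the question):
    public class CommaBytes {
        public static void main(String[] args) {
            long size = 1_709_683_486L; // byte count from the question
            // %,d groups digits with the locale's separator: 1,709,683,486
            System.out.println(String.format("%,d bytes", size));
        }
    }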
