Cheat for publishing large files to .Mac!

OK, so someone may have already posted this, but it's suddenly made my life MUCH easier!
Having had a week of trouble uploading large podcasts to .Mac from iWeb and via Goliath, I've found a quick cheat.
- create your Entries with dummy files of two seconds of blank audio/video - name each one something relevant to the podcast entry so you will recognise it later.
- publish site (should be much quicker and easier).
- find a relevant podcast folder on your iDisk.
- upload your larger file to the relevant folder. I found Goliath more stable than Finder.
- delete small file from the folder.
- re-name the larger file so it has the exact same name (capitalisation etc.) as the smaller file. Goliath was also better for re-naming; Finder had a habit of hanging.
- bingo. You can listen to/watch the larger files when you visit your site and subscription seems to work too.
It's working so far. Just be aware that if you amend those Entries in iWeb, the small files will re-appear after you publish, and you have to upload the larger ones again.
Hope that helps
15" Powerbook   Mac OS X (10.4.3)   1.67 GHz | 1GB RAM

Cyberduck doesn't seem to do AFP; I can use SFTP, but does that preserve all Apple file properties the way AFP copy does?

Similar Messages

  • Java proxies for handling large files

    Dear all,
    Kindly let me know how to handle this, with a step-by-step explanation, as I do not know much about Java.
    What is the advantage of using Java proxies here? Do we implement the split logic in Java code for handling a 600 MB file?
    Please mail the same to me at [email protected]

    Hi Srinivas!
    Check out this blog for the large file handling issue:
    /people/sravya.talanki2/blog/2005/11/29/night-mare-processing-huge-files-in-sap-xi
    This will help you.
    Please also see the documents below; these might help:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/a068cf2f-0401-0010-2aa9-f5ae4b2096f9
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/f272165e-0401-0010-b4a1-e7eb8903501d
    /people/prasad.ulagappan2/blog/2005/06/27/asynchronous-inbound-java-proxy
    /people/rashmi.ramalingam2/blog/2005/06/25/an-illustration-of-java-server-proxy
    We can also find them on your XI/PI server in these folders:
    aii_proxy_xirt.jar
    j2ee\cluster\server0\bin\ext\com.sap.aii.proxy.xiruntime
    aii_msg_runtime.jar
    j2ee\cluster\server0\bin\ext\com.sap.aii.messaging.runtime
    aii_utilxi_misc.jar
    j2ee\cluster\server0\bin\ext\com.sap.xi.util.misc
    guidgenerator.jar
    j2ee\cluster\server0\bin\ext\com.sap.guid
    Java Proxy
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/a068cf2f-0401-0010-2aa9-f5ae4b2096f9
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/f272165e-0401-0010-b4a1-e7eb8903501d
    Please reward points if useful.
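    If it helps, here is a rough sketch of what the split logic itself could look like in plain Java, independent of any proxy framework. The file name and the 10 MB chunk size are made up for illustration, so treat it as a starting point rather than a finished solution:

    import java.io.*;

    // Sketch only: break one large input file into fixed-size chunk files.
    // "large_input.dat" and the 10 MB chunk size are illustrative placeholders.
    public class FileSplitter {
        public static void main(String[] args) throws IOException {
            File input = new File("large_input.dat");   // hypothetical 600 MB source file
            int chunkSize = 10 * 1024 * 1024;           // 10 MB per chunk (adjust as needed)
            int part = 0;

            try (InputStream in = new BufferedInputStream(new FileInputStream(input))) {
                byte[] chunk;
                // readNBytes (Java 11+) returns up to chunkSize bytes, or an empty array at EOF
                while ((chunk = in.readNBytes(chunkSize)).length > 0) {
                    File out = new File(String.format("chunk_%03d.dat", part++));
                    try (OutputStream os = new FileOutputStream(out)) {
                        os.write(chunk);                // write one chunk file
                    }
                }
            }
            System.out.println("Wrote " + part + " chunk files.");
        }
    }

    Each chunk can then be processed or sent separately, which is usually the point of splitting a 600 MB file in the first place.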

  • Processing large files on Mac OS X Lion

    Hi All,
    I need to process large files (few GB) from a measurement. The data files contain lists of measured events. I process them event by event and the result is relatively small and does not occupy much memory. The problem I am facing is that Lion "thinks" that I want to use the large data files later again and puts them into cache (inactive memory). The inactive memory is growing during the reading of the datafiles up to a point where the whole memory is full (8GB on MacBook Pro mid 2010) and it starts swapping a lot. That of course slows down the computer considerably including the process that reads the data.
    If I run "purge" command in Terminal, the inactive memory is cleared and it starts to be more responsive again. The question is: is there any way how to prevent Lion to start pushing running programs from memory into the swap on cost of useless harddrive cache?
    Thanks for suggestions.

    It's been a while but I recall using the "dd" command ("man dd" for info) to copy specific portions of data from one disk, device or file to another (in 512 byte increments).  You might be able to use it in a script to fetch parts of your larger file as you need them, and dd can read from and/or write to standard input/output, so it's easy to get data and store it in a temporary container like a file or even a variable.
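    If you end up doing this from your own code instead of a shell script, the same ranged-read idea can be sketched in plain Java - purely an illustration, with the path, offset and slice size made up:

    import java.io.IOException;
    import java.io.RandomAccessFile;

    // Rough analogue of "dd skip=... count=...": read one slice of a huge file
    // without keeping the whole thing in memory. Path, offset and length are placeholders.
    public class SliceReader {
        public static byte[] readSlice(String path, long offset, int length) throws IOException {
            try (RandomAccessFile raf = new RandomAccessFile(path, "r")) {
                byte[] slice = new byte[length];
                raf.seek(offset);         // jump to the start of the slice
                raf.readFully(slice);     // read exactly 'length' bytes
                return slice;
            }
        }

        public static void main(String[] args) throws IOException {
            // e.g. read 1 MB starting 2 GB into the file
            byte[] data = readSlice("/Volumes/Data/measurement.dat",
                                    2L * 1024 * 1024 * 1024, 1024 * 1024);
            System.out.println("Read " + data.length + " bytes");
        }
    }

    Either way, the idea is to pull in only the slice you currently need rather than streaming the whole multi-GB file through the filesystem cache.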
    Otherwise, if you can afford it, and you might with 8 GB of RAM, you could try disabling swapping (paging to disk) altogether and see if that helps...
    To disable paging, run the following command (in one line) in Terminal and reboot:
    sudo launchctl unload -w /System/Library/LaunchDaemons/com.apple.dynamic_pager.plist
    To re-enable paging, run the following command (in one line) in Terminal:
    sudo launchctl load -w /System/Library/LaunchDaemons/com.apple.dynamic_pager.plist
    Hope this helps!

  • Best compression for exporting large files from iMovie for use in iDVD?

    I have a doco that I've made as separate projects (too slow to make it all in one project), which I then exported as QuickTime .mov files. When I assembled the .mov files in iDVD I ended up with 4.7GB of material, which iDVD flipped out on and crashed.
    What would be the best compression to apply to the large .mov files to reduce the overall file sizes by, say, 10%?
    It's sometimes taking up to 6 hours to render each project; average total screen time per project is about 70 minutes, and each project file size ranges from 1 to 6GB...
    I feel like I'm a little out of my depth on this one!
    Any suggestions for this dilemma?
    Thanks
    Tony

    When I assembled the .mov files in iDVD I ended up with 4.7GB of material which iDVD flipped out on and crashed... Any suggestions for this dilemma?
    Not sure if your 4.7GB reference is for the source files or the resulting "muxed" MPEG2/PCM encoded content. In any case, a single-layer DVD is limited to 4.7 GB of total content. To be safe, this usually means you are limited to 4.2 to 4.3 GB of encoded movie and/or slideshow content, depending on how many and what type of menus you are creating. Thus, if you are encoding for a single-layer DVD and your project's encoded content exceeds the capacity of your DVD, then this would likely explain your problem.
    If your project size reference is for the source content being added to your iDVD project, then it is totally irrelevant and you need to concentrate on the total duration of your iDVD project and the method of encoding instead of file size. Basically, for a single-layer DVD, only about 60 minutes of content can be encoded using the "Best Performance" encode option, 90 minutes using the "High Quality" setting, or 2 hours in the "Professional Quality" encode mode. Thus, if your content duration exceeds the capacity of your encoding mode here, it would likely explain your problem.
    In either case, the solution would be to either burn to higher capacity optical media/higher capacity encoding mode or burn your content to multiple DVDs as individual projects at the current media capacity/encoding mode.

  • Is it best to upload HD movies from a camcorder to iMovie or iPhoto?  iMovie gives the option for very large file sizes - presumably it is best to use this file size because the HD movie is then at its best quality?

    Is it best to upload an HD movie to iPhoto or iMovie?  iMovie seems to store the movie in much larger files - presumably this is preferable, as a larger file means better quality?

    Generally it is if you're comparing identical compressors & resolutions, but there's something else happening here.  If you're worried about quality degrading, check the original file details on the camera (or card), either with Get Info or by opening in QuickTime and showing info. You should find the iPhoto version (reveal in Finder) is a straight copy.  You can't really increase the image quality of a movie (barring a few tricks) by increasing file size, but Apple editing products create a more "scrub-able" intermediate file, which is quite large.
    Good luck and happy editing.

  • I need a good program for merging MP4 files on Mac; I need a program that won't mess up the sound.

    Hey, pal. If you are using a Mac, I think a professional MP4 video joiner for Mac would be the best choice, which can combine MP4 video and output to iPod (also to QuickTime, iMovie, iPhone...).
    It was recommended by lots of famous Mac-related sites, such as macworld.com, macnn.com, maclife.com, etc. You will never know how powerful it is. It's the best all-in-one Mac converter I have ever used.
    Glad to share. And you can find the truth on the official website: http://imp4converter.com/mp4joiner_mac.html

    ??? Sorry, my knowledge does not go that far! By bouncing do you mean drop into iTunes? Remember some of the files are not split into tracks... Thanks!

  • Xgrid for gzipping large files?

    Hi there. I'm totally new to Xgrid, and I'm wondering if it can take a load off of my servers. Currently my users are pushing large datasets (50GB-2TB, typically segmented into 2GB files) to a share point, then gzipping to a single file via ssh. I'd love to off-load this chore to a grid... either on a scheduled/automated basis, or via a trigger of some sort when the user is ready. I've looked over some of the info that's available online, and I'm afraid I haven't been able to figure out if this is even possible, and if so, how to go about implementing such a system. We have loads of systems with spare cycles that I'd love to leverage if possible. Thanks for any input or direction... chris

    I'm running an Xgrid, but not (yet) developing software for it. (We're using it for medical imaging.)
    Xgrid works best for stuff that can be split into small chunks. So if you regard your 2GB files as "small chunks" of a larger dataset, you could distribute those chunks and have your clients compress one chunk each, and hence speed up the whole process in total.
    So your Xgrid controller would have to send the chunks for processing to your agents and receive the results (gzipped files).
    Regarding the size of your datasets, this would probably wreak havoc on your network.
    Just out of curiosity:
    If you may disclose details, what are your datasets about?
    MacLemon
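    For the compression step itself, a minimal sketch of what each agent might run on its one chunk, in plain Java, could look like this (the segment file name is a placeholder, and this only gzips a single chunk rather than building the final combined archive):

    import java.io.*;
    import java.util.zip.GZIPOutputStream;

    // Sketch: compress one segment file to segment.gz; names and sizes are illustrative.
    public class ChunkGzipper {
        public static void main(String[] args) throws IOException {
            File input = new File("segment_0001.dat");     // one ~2 GB segment (hypothetical name)
            File output = new File(input.getName() + ".gz");
            byte[] buffer = new byte[1 << 20];              // 1 MB copy buffer

            try (InputStream in = new BufferedInputStream(new FileInputStream(input));
                 GZIPOutputStream out = new GZIPOutputStream(new FileOutputStream(output))) {
                int n;
                while ((n = in.read(buffer)) > 0) {
                    out.write(buffer, 0, n);                // stream the chunk through gzip
                }
            }
            System.out.println("Compressed " + input + " -> " + output);
        }
    }

    The controller would still need to collect the per-chunk .gz results and combine them, which is where the network traffic MacLemon mentions comes in.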

  • Import hangs for (very) large file

    Hi All,
    We're importing a file that is converted to a single data column in the beffileimport script.
    So I have a flat file with 250,000 rows and 4 data columns, and I convert it to a file with 1,000,000 rows and 1 data column.
    In the inbox I can see this process takes about 12 minutes. Which is long, but not a problem for us.
    After that, the FDM web client (11.1.1.3) keeps showing "processing, please wait..." (even after a day).
    We saw that there was no activity on the server, so we closed the web client, logged on again, and the import action was successful.
    So for some reason, because the import step takes so long, it doesn't show a success message but keeps hanging.
    Any suggestions how to solve this?
    thanks in advance,
    Marc

    The only thing that I would be aware of that would cause this is inserting a timeout Registry key on the workstation, as noted in the following Microsoft KB. If you have this in place, remove it and reboot, and then test the import again.
    http://support.microsoft.com/kb/181050

  • Instructions for moving iTunes/files from Mac to Mac?

    I want to move my iTunes acct & files from my PBG4 to my iMac. Where can I get instructions & guidance to do this?

    Go to the top of this page, click "Support", type "move itunes library" in the search bar.
    You can also use Home Share. Click "Support", type "Home Share" in the search bar.
    You can use Migration assistant, comes with every Mac.
    You can use target mode, click "Support", then type "Target Mode" in the search bar.

  • When I use Image Trace for a large file, it keeps deleting one corner of the file.

    I've tried several tracing options and it keeps happening. The image was created in Illustrator, but I need it all on one layer,
    so I took it into Photoshop.  Now back to Illustrator because the fabricator needs that format.  Reduced the image size, same problem.
    Does anyone have any ideas?

    Without seeing the artwork I'm not sure how I would do it.
    I might try a script, such as "Turn selected AI sublayers into top-level layers"?
    Or it may be as simple as selecting all and doing a Pathfinder Unite, but that may or may not cause big issues.
    But if it is just a handful of shapes for a water jet to cut out, then that may be a very simple way to fix your problem.

  • Recommended Structure for Large Files

    I am working at re-familiarizing myself with Oracle development and management, so please forgive my ignorance on some of these topics or questions. I will be working with a client who is planning a large-scale database for what is called "Flow Cytometry" data, which will be linked to research publications. The actual data files (FCS) and various text, tab-delimited and XML files will all be provided by researchers in a wrapper or zip container which will be parsed by some as-yet-to-be-developed tools. For the most part, data will consist of a large FCS file containing the actual Flow Cytometry data, along with various accompanying text/XML files containing the metadata (experiment details, equipment, reagents etc). What is most important is the metadata, which will be used to search for experiments etc. For the most part the actual FCS data files (up to 100-300 MB) will only need to be linked (stored as BLOBs?) to the metadata, and their content will be used at a later time for actual analysis.
    1: Since the actual FCS files are large, and may not initially be parsed and imported into the DB for later analysis, how best can/should Oracle be configured/partitioned so that a larger/direct-attached storage drive/partition can be used for the large files, so as not to take up space where the actual running instance of Oracle is installed? We are expecting around 1TB of data files initially.
    2: Are there any on-line resources which might be of value to such an implementation?

    Large files can be stored using the BFILE datatype. The data need not be transferred to Oracle tablespaces, and the files will reside in the OS file system.
    It is also possible to index BFILEs using Oracle Text indexing.
    http://www.oracle-base.com/articles/9i/FullTextIndexingUsingOracleText9i.php
    http://www.stanford.edu/dept/itss/docs/oracle/10g/text.101/b10730/cdatadic.htm
    http://www.idevelopment.info/data/Oracle/DBA_tips/Oracle_Text/TEXT_3.shtml
    Mohan
    http://www.myoracleguide.com/
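    To make the BFILE suggestion a little more concrete, here is a minimal sketch using plain JDBC; the directory path, table, column names and connection details are all invented for illustration, so adapt them to the real schema:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.Statement;

    // Sketch: keep searchable metadata in a table and point a BFILE column at the
    // large FCS file, which stays on disk outside the tablespaces.
    // All names (directory, table, columns, JDBC URL) are hypothetical.
    public class FcsMetadataLoader {
        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@dbhost:1521/ORCL", "app_user", "app_password")) {

                try (Statement ddl = conn.createStatement()) {
                    // One-time setup: map a logical directory to the storage mount,
                    // and create a metadata table with a BFILE locator column.
                    ddl.execute("CREATE OR REPLACE DIRECTORY fcs_dir AS '/mnt/fcs_storage'");
                    ddl.execute("CREATE TABLE experiments ("
                            + " id NUMBER PRIMARY KEY,"
                            + " description VARCHAR2(4000),"
                            + " fcs_data BFILE)");
                }

                // Per-experiment: insert metadata plus a pointer to the 100-300 MB FCS file.
                String sql = "INSERT INTO experiments (id, description, fcs_data) "
                           + "VALUES (?, ?, BFILENAME('FCS_DIR', ?))";
                try (PreparedStatement ps = conn.prepareStatement(sql)) {
                    ps.setInt(1, 1);
                    ps.setString(2, "CD4 panel, donor 42");   // metadata used for searching
                    ps.setString(3, "run_0001.fcs");          // file name relative to FCS_DIR
                    ps.executeUpdate();
                }
            }
        }
    }

    The FCS files themselves stay on the direct-attached storage mapped by the directory object, so only the searchable metadata consumes tablespace.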

  • Partition External HD to store large files

    Can I partition my WD My Book for Mac external hard drive so that I can place large files, such as movies, from my MacBook Pro onto the newly created partition on the external HD, then delete them off my MacBook Pro? I assume I would need to format it Mac OS Extended (Journaled)?  Also, I use Time Machine to back up the MacBook Pro to the external HD, so I have not downloaded the WD Smartware software.  Without downloading WD's software, can I manually place and retrieve the large files using the MacBook Pro's Finder, I'm assuming? And how difficult would it be?
    Thanks in advance,
    AndrewMx17

    So I was able to shrink the partition on the external HD that I was using for Time Machine backups (so I wouldn't lose my backup) and create another partition for the large files I would like to store, then use Finder to place and retrieve files.  One thing I did encounter was that the partition I was using to back up the MacBook was found to be corrupt, but the 'First Aid' tool in Disk Utility was able to fix it so I could partition the drive.

  • Windows Explorer misreads large-file .zip archives

       I just spent about 90 minutes trying to report this problem through
    the normal support channels with no useful result, so, in desperation,
    I'm trying here, in the hope that someone can direct this report to some
    useful place.
       There appears to be a bug in the .zip archive reader used by Windows
    Explorer in Windows 7 (and up, most likely).
       An Info-ZIP Zip user recently reported a problem with an archive
    created using our Zip program.  The archive was valid, but it contained
    a file which was larger than 4GiB.  The complaint was that Windows
    Explorer displayed (and, apparently believed) an absurdly large size
    value for this large-file archive member.  We have since reproduced the
    problem.
       The original .zip archive format includes uncompressed and compressed
    sizes for archive members (files), and these sizes were stored in 32-bit
    fields.  This caused problems for files which are larger than 4GiB (or,
    on some system types, where signed size values were used, 2GiB).  The
    solution to this fundamental limitation was to extend the .zip archive
    format to allow storage of 64-bit member sizes, when necessary.  (PKWARE
    identifies this format extension as "Zip64".)
       The .zip archive format includes a mechanism, the "Extra Field", for
    storing various kinds of metadata which had no place in the normal
    archive file headers.  Examples include OS-specific file-attribute data,
    such as Finder info and extended attributes for Apple Macintosh; record
    format, record size, and record type data for VMS/OpenVMS; universal
    file times and/or UID/GID for UNIX(-like) systems; and so on.  The Extra
    Field is where the 64-bit member sizes are stored, when the fixed 32-bit
    size fields are too small.
       An Extra Field has a structure which allows multiple types of extra
    data to be included.  It comprises one or more "Extra Blocks", each of
    which has the following structure:
           Size (bytes) | Description
          --------------+------------
                2       | Type code
                2       | Number of data bytes to follow
            (variable)  | Extra block data
       The problem with the .zip archive reader used by Windows Explorer is
    that it appears to expect the Extra Block which includes the 64-bit
    member sizes (type code = 0x0001) to be the first (or only) Extra Block
    in the Extra Field.  If some other Extra Block appears at the start of
    the Extra Field, then its (non-size) data are being incorrectly
    interpreted as the 64-bit sizes, while the actual 64-bit size data,
    further along in the Extra Field, are ignored.
       Perhaps the .zip archive _writer_ used by Windows Explorer always
    places the Extra Block with the 64-bit sizes in this special location,
    but the .zip specification does not demand any particular order or
    placement of Extra Blocks in the Extra Field, and other programs
    (Info-ZIP Zip, for example) should not be expected to abide by this
    artificial restriction.  For details, see section "4.5 Extensible data
    fields" in the PKWARE APPNOTE:
          http://www.pkware.com/documents/casestudies/APPNOTE.TXT
       A .zip archive reader is expected to consider the Extra Block type
    codes, and interpret accordingly the data which follow.  In particular,
    it's not sufficient to trust that any particular Extra Block will be the
    first one in the Extra Field.  It's generally safe to ignore any Extra
    Block whose type code is not recognized, but it's crucial to scan the
    Extra Field, identify each Extra Block, and handle it according to its
    type.
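       As a rough illustration of that scan-by-type-code approach (this is
    not Windows Explorer's code, just a simplified sketch; per the APPNOTE,
    which 64-bit values are actually present in the Zip64 block depends on
    which 32-bit header fields were set to 0xFFFFFFFF), a reader might walk
    the Extra Field like this:

    import java.nio.ByteBuffer;
    import java.nio.ByteOrder;

    // Sketch only: walk an Extra Field and pull 64-bit sizes from the Zip64
    // block (type 0x0001), wherever it appears among the Extra Blocks.
    public class ExtraFieldScanner {
        public static long[] findZip64Sizes(byte[] extraField) {
            ByteBuffer buf = ByteBuffer.wrap(extraField).order(ByteOrder.LITTLE_ENDIAN);
            while (buf.remaining() >= 4) {
                int type = buf.getShort() & 0xFFFF;    // 2-byte type code
                int len  = buf.getShort() & 0xFFFF;    // 2-byte data length
                if (len > buf.remaining()) {
                    break;                             // malformed field; stop scanning
                }
                if (type == 0x0001 && len >= 16) {     // Zip64 extended information block
                    long uncompressed = buf.getLong();
                    long compressed   = buf.getLong();
                    return new long[] { uncompressed, compressed };
                }
                buf.position(buf.position() + len);    // skip unrecognized block
            }
            return null;                               // no Zip64 block found
        }
    }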
       Here are some relatively small (about 14MiB each) test archives which
    illustrate the problem:
          http://antinode.info/ftp/info-zip/ms_zip64/test_4g.zip
          http://antinode.info/ftp/info-zip/ms_zip64/test_4g_V.zip
          http://antinode.info/ftp/info-zip/ms_zip64/test_4g_W.zip
       Correct info, from UnZip 6.00 ("unzip -lv"):
    Archive:  test_4g.zip
     Length   Method    Size  Cmpr    Date    Time   CRC-32   Name
    4362076160  Defl:X 14800839 100% 05-01-2014 15:33 6d8d2ece  test_4g.txt
    Archive:  test_4g_V.zip
     Length   Method    Size  Cmpr    Date    Time   CRC-32   Name
    4362076160  Defl:X 14800839 100% 05-01-2014 15:33 6d8d2ece  test_4g.txt
    Archive:  test_4g_W.zip
     Length   Method    Size  Cmpr    Date    Time   CRC-32   Name
    4362076160  Defl:X 14800839 100% 05-01-2014 15:33 6d8d2ece  test_4g.txt
    (In these reports, "Length" is the uncompressed size; "Size" is the
    compressed size.)
       Incorrect info, from (Windows 7) Windows Explorer:
    Archive        Name          Compressed size   Size
    test_4g.zip    test_4g.txt         14,454 KB   562,951,376,907,238 KB
    test_4g_V.zip  test_4g.txt         14,454 KB   8,796,110,221,518 KB
    test_4g_W.zip  test_4g.txt         14,454 KB   1,464,940,363,777 KB
       Faced with these unrealistic sizes, Windows Explorer refuses to
    extract the member file, for lack of (petabytes of) free disk space.
       The archive test_4g.zip has the following Extra Blocks: universal
    time (type = 0x5455) and 64-bit sizes (type = 0x0001).  test_4g_V.zip
    has: PKWARE VMS (type = 0x000c) and 64-bit sizes (type = 0x0001).
    test_4g_W.zip has: NT security descriptor (type = 0x4453), universal
    time (type = 0x5455), and 64-bit sizes (type = 0x0001).  Obviously,
    Info-ZIP UnZip has no trouble correctly finding the 64-bit size info in
    these archives, but Windows Explorer is clearly confused.  (Note that
    "1,464,940,363,777 KB" translates to 0x0005545500000400 (bytes), and
    "0x00055455" looks exactly like the size, "0x0005" and the type code
    "0x5455" for a "UT" universal time Extra Block, which was present in
    that archive.  This is consistent with the hypothesis that the wrong
    data in the Extra Field are being interpreted as the 64-bit size data.)
       Without being able to see the source code involved here, it's hard to
    know exactly what it's doing wrong, but it does appear that the .zip
    reader used by Windows Explorer is using a very (too) simple-minded
    method to extract 64-bit size data from the Extra Field, causing it to
    get bad data from a properly formed archive.
       I suspect that the engineer involved will have little trouble finding
    and fixing the code which parses an Extra Field to extract the 64-bit
    sizes correctly, but if anyone has any questions, we'd be happy to help.
       For the Info-ZIP (http://info-zip.org/) team,
       Steven Schweda

    > We can't get the source (info-zip) program for test.
       I don't know why you would need to, but yes, you can:
          http://www.info-zip.org/
          ftp://ftp.info-zip.org/pub/infozip/src/
    You can also get pre-built executables for Windows:
          ftp://ftp.info-zip.org/pub/infozip/win32/unz600xn.exe
          ftp://ftp.info-zip.org/pub/infozip/win32/zip300xn.zip
    > In addition, since other zip application runs correctly. Since it should
    > be your software itself issue.
       You seem to misunderstand the situation.  The facts are these:
       1.  For your convenience, I've provided three test archives, each of
    which includes a file larger than 4GiB.  These archives are valid.
       2.  Info-ZIP UnZip (version 6.00 or newer) can process these archives
    correctly.  This is consistent with the fact that these archives are
    valid.
       3.  Programs from other vendors can process these archives correctly.
    I've supplied a screenshot showing one of them (7-Zip) doing so, as you
    requested.  This is consistent with the fact that these archives are
    valid.
       4.  Windows Explorer (on Windows 7) cannot process these archives
    correctly, apparently because it misreads the (Zip64) file size data.
    I've supplied a screenshot of Windows Explorer showing the bad file size
    it gets, and the failure that occurs when one tries to use it to extract
    the file from one of these archives, as you requested.  This is
    consistent with the fact that there's a bug in the .zip reader used by
    Windows Explorer.
       Yes, "other zip application runs correctly."  Info-ZIP UnZip runs
    correctly.  Only Windows Explorer does _not_ run correctly.

  • Need help exporting large file to bitmap format

    I have a large file that I need to export to a bitmap format and I am experiencing unknown "IMer" errors, memory and file size limitations.
    The file contains only one artboard, of size around 8000 pixels by 6000 pixels. I can successfully export as a png at 72dpi, giving me a 100% image of my AI file. However I need to export an image at 200% of the AI file. If I export as a png at 144dpi to achieve this, I get the following error(s):
    "The operation cannot complete because of an unknown error (IMer)" or "Could not complete this operation. The rasterised image exceeded the maximum bounds for saving to the png format"
    I have tried exporting as tiff, bmp and jpeg but all give "not enough memory"  or "dimensions out of range" errors.
    Does anyone know the limitations with AI for saving large files to bitmap format?
    Does anyone have a workaround I could try?

    You could try printing to Adobe Postscript files.
    This will actually save a .ps file and you can scale it to 200% when doing so. You'll need to "print" for every image change.
    Then open a Photoshop document at your final size (16000x12000 pixels) and place or drag the .ps files into the Photoshop document.
    I just tested this and it seems to work fine here. But, of course, I don't have your art, so complexity of objects might factor in and I'm not seeing that here.
    And you are aware that Photoshop won't allow you to save a png at those dimensions either, right? The image is too large for PNG. Perhaps you could explain why you specifically need a png that large?

  • Can't Copy Large Files To External Drive

    This is driving me nuts. I'm running 10.4.11 with an external Newer Tech Ministack 320GB hard drive that's only ten weeks old. Yesterday I downloaded some large (50mb or so) QuickTime files to my desktop, which is to say my internal drive. I then tried to copy them to my external drive, since that's where I keep my movie files since they take up so much room. One copied over without any trouble, but numerous attempts to copy the others result in the process freezing after anywhere from 1/4 to 3/4 of the data has been copied, or else I get the "error 36" message. My first thought was that the external drive was failing, but I can copy small movie files (10mb or so) back and forth between the two drives with no problem. What's the point of having a large external drive if you can't use it for your large files? Repaired permissions on both drives, no improvement.

    I have repaired permissions and run disk repair on the internal drive as well. Yes, I've played the clips in their entirety, but I'm going to do it again and look for any hiccups. That's an interesting theory and I want to check it out.
    It has occurred to me that a workaround, although it by no means addresses the underlying problem, is to use QuickTime Pro to chop the clips up into several smaller files, copy them to the external drive, and then reassemble them via QuickTime Pro on the external drive.
    FWIW, I neglected to mention (although I don't know if it's pertinent or not) that immediately after installing the external drive ten weeks ago I used SuperDuper to clone my entire internal drive onto the external firewire drive. I then erased the movies from the internal drive, which was bumping up against maximum capacity. I am able to boot from the external drive, and I want to retain this capacity should the internal drive (which is, after all, six and a half years old) fail.
