Failing to print very large files

I've got an iMac running the newest Leopard, printing to a network printer. Printing usually works without incident. However, when printing large documents -- say a 20-40MB PowerPoint presentation with lots of color -- I receive the following message after 10-20 minutes of waiting:
/usr/libexec/cups/backend/socket failed
The printer is an HP Color LaserJet 8550DN with 96MB RAM and a 3.1GB internal HD.
When printing large documents to this printer, printing seems to proceed correctly (if slowly) before reaching the inevitable error message.
Possible causes would seem to be
--too large a print job for the printer (though lpq suggests that many of the jobs that failed were <50MB). There was no other network printing taking place on this printer when the jobs failed.
--some kind of timeout issue. How is the timeout value set? It doesn't appear to be in /etc/cups/cupsd.conf, at least at the moment.
Any other suggestions appreciated.
Thanks.

This info relates directly only to an Epson 9800. However, it may have applications elsewhere.
Generally, the print spooler in the 9800 with Photoshop will not print large files (>100 MB or so) unless you first convert them to the printer profile, shut down and restart the Mac, and then print the converted image. This is true for both Tiger and Leopard. In addition, Leopard takes about twice as long to spool, and since I print lots of 500 MB files, I have given up and gone back to Tiger (for lots of other reasons as well).
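
On the timeout question in the original post: cupsd.conf accepts a Timeout directive (a value in seconds) even when no such line is present, in which case a compiled-in default applies. As a rough diagnostic sketch only, and assuming the failure really is a server-side timeout rather than a printer limitation, you could add a line like the following to /etc/cups/cupsd.conf and restart cupsd:

    # /etc/cups/cupsd.conf -- raise the request timeout (seconds); 3600 is an arbitrary test value
    Timeout 3600

If the job still dies at roughly the same point, the timeout is probably elsewhere (the printer itself or the socket backend), so treat this as a test rather than a fix.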

Similar Messages

  • Unable to copy very large file to eSATA external HDD

    I am trying to copy a VMWare Fusion virtual machine, 57 GB, from my Macbook Pro's laptop hard drive to an external, eSATA hard drive, which is attached through an ExpressPort adapter. VMWare Fusion is not running and the external drive has lots of room. Disk Utility finds no problems with either drive. I have excluded both the external disk and the folder on my laptop hard drive that contains my virtual machine from my Time Machine backups. At about the 42 GB mark, an error message appears:
    The Finder cannot complete the operation because some data in "Windows1-Snapshot6.vmem" could not be read or written. (Error code -36)
    After I press OK to remove the dialog, the copy does not continue, and I cannot cancel the copy. I have to force-quit the Finder to make the copy dialog go away before I can attempt the copy again. I've tried rebooting between attempts, still no luck. I have tried a total of 4 times now, exact same result at the exact same place, 42 GB / 57 GB.
    Any ideas?

    Still no breakthrough from Apple. They're telling me to terminate the VMWare processes before attempting the copy, but had they actually read my description of the problem first, they would have known that I already tried this. Hopefully they'll continue to investigate.
    From a correspondence with Tim, a support representative at Apple:
    Hi Tim,
    Thank you for getting back to me, I got your message. Although it is true that at the time I ran the Capture Data program there were some VMWare-related processes running (PID's 105, 106, 107 and 108), this was not the case when the issue occurred earlier. After initially experiencing the problem, this possibility had occurred to me so I took the time to terminate all VMWare processes using the activity monitor before again attempting to copy the files, including the processes mentioned by your engineering department. I documented this in my posting to apple's forum as follows: (quote is from my post of Feb 19, 2008, 1:28pm, to the thread "Unable to copy very large file to eSATA external HDD", relevant section in >bold print<)
    Thanks for the suggestions. I have since tried this operation with 3 different drives through two different interface types. Two of the drives are identical - 3.5" 7200 RPM 1TB Western Digital WD10EACS (WD Caviar SE16) in external hard drive enclosures, and the other is a smaller USB2 100GB Western Digital WD1200U0170-001 external drive. I tried the two 1TB drives through eSATA - ExpressPort and also over USB2. I have tried the 100GB drive only over USB2 since that is the only interface on the drive. In all cases the result is the same. All 3 drives are formatted Mac OS Extended (Journaled).
    I know the files work on my laptop's hard drive. They are a VMWare virtual machine that works just fine when I use it every day. >Before attempting the copy, I shut down VMWare and terminated all VMWare processes using the Activity Monitor for good measure.< I have tried the copy operation both through the finder and through the Unix command prompt using the drive's mount point of /Volumes/jfinney-ext-3.
    Any more ideas?
    Furthermore, to prove that there were no file locks present on the affected files, I moved them to a different location on my laptop's HDD and renamed them, which would not have been possible if there had been interference from vmware-related processes. So, that's not it.
    Your suggested workaround, to compress the files before copying them to the external drive, may serve as a temporary workaround but it is not a solution. This VM will grow over time to the point where even the compressed version is larger than the 42GB maximum, and compressing and uncompressing the files will take me a lot of time for files of this size. Could you please continue to pursue this issue and identify the underlying cause?
    Thank you,
    - Jeremy

  • Upload very large file to SharePoint Online

    Hi,
    I tried uploading very large files to SharePoint Online via the REST API using Java. Uploading files with a size of ~160 MB works well, but if the file size is between 400 MB and 1.6 GB I either receive a SocketException (connection reset by peer) or an error from SharePoint telling me "the security validation for the page is invalid".
    So first of all I want to ask how to work around the security validation error. As far as I understand it, the token which I have added to the X-RequestDigest header of the HTTP POST request has expired (by default it expires after 1800 seconds).
    Uploading such large files is time consuming, so I can't change the token in my header while uploading the file. How can I extend the expiration time of the token (if that is possible)? Could it help to keep sending POST requests with the same token while uploading the file, and thereby prevent the token from expiring? Is there any other strategy for uploading such large files, which need much more than 1800 seconds to upload?
    Additionally, any thoughts on the socket exception? It happens quite sporadically; sometimes after uploading 100 MB, other times only after more than 1 GB.
    Thanks in advance!

    Hi,
    Thanks for the reply. The reason I'm looking into this is so users can migrate their files to SharePoint Online/Office 365. The max file limit is 2 GB, so I thought I would try to cover this.
    I've looked into that link before, and when I try to use those endpoints, i.e. StartUpload, I get the following response:
    {"error":{"code":"-1, System.NotImplementedException","message":{"lang":"en-US","value":"The method or operation is not implemented."}}}
    I can only presume that these endpoints have not been implemented yet, even though the documentation says they are only available for Office 365, which is what I am using.
    Also, the other strange thing is that I can actually upload a file of 512 MB to the server, as I can see it in the web UI. The problem I am experiencing is getting a response; it hangs on this line until the timeout is reached.
    WebResponse wresp = wreq.GetResponse();
    I've tried flushing and closing the stream, and setting SendChunked, but still no success.
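
    One thing that might be worth trying for the digest-expiry question above (a rough sketch, not a confirmed fix; authentication and JSON parsing are left out, and siteUrl/authHeader are placeholders): request a fresh form digest from /_api/contextinfo immediately before each upload request instead of reusing one obtained earlier, since the response carries a new FormDigestValue with its own expiry window.

    import java.io.*;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class DigestRefresh {
        // POSTs an empty body to /_api/contextinfo and returns the raw JSON response;
        // the caller extracts d.GetContextWebInformation.FormDigestValue from it.
        public static String requestContextInfo(String siteUrl, String authHeader) throws IOException {
            URL url = new URL(siteUrl + "/_api/contextinfo");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Accept", "application/json;odata=verbose");
            conn.setRequestProperty("Authorization", authHeader);
            conn.setDoOutput(true);
            conn.setFixedLengthStreamingMode(0);      // empty POST body
            conn.getOutputStream().close();

            BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream(), "UTF-8"));
            StringBuilder body = new StringBuilder();
            for (String line; (line = in.readLine()) != null; ) body.append(line);
            in.close();
            return body.toString();
        }
    }

    Whether this also helps with the sporadic connection resets is doubtful; those look more like network or front-end behaviour than anything the client-side token can influence.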

  • Very large file upload 2 GB with Adobe Flash

    Hello, anyone know how can I upload very large files with Adobe Flash or how to use SWFUpload?
    Thanks in Advance, All help will be very appreciated.

    1. yes
    2. I'm getting an error message from PHP:
    if( $_FILES['Filedata']['error'] == 0 ){
        if( move_uploaded_file( $_FILES['Filedata']['tmp_name'], $uploads_dir.$_FILES['Filedata']['name'] ) ){
            echo 'ok';
            exit();
        }
        echo 'error';    // Overhere -- reached only when move_uploaded_file() fails
        exit();
    }

  • How do I control read start position in a very large file where start byte position may be larger than I32 (+/- 2^31)?

    Using LabView, I am trying to read a very large file which may be on the order of 2^32 bytes. I need to be able to step into the file at a byte position which may be greater than the I32 limit set by the read file.vi. Are there any options to the read file.vi or a method of circumventing this limitation?

    I'm not sure, but I think you can use the "pos mode" input of the "seek" sub-VI.
    "Pos mode" lets you choose the position that the byte offset is added to.
    You could first seek with an I32 offset relative to the start, and then seek again with "pos mode" set to "current" to add another offset. Repeating this lets you move more than 2^31 bytes from the beginning of the file.
    I hope the idea is clear; I haven't tried it myself, but I think it would work.
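
    The arithmetic behind that suggestion, sketched in Java purely as an illustration (LabVIEW's file VIs are graphical, so only the idea of splitting one large offset into several I32-sized relative moves carries over):

    public class SplitSeekDemo {
        public static void main(String[] args) {
            long target = 3_000_000_000L;   // desired byte position, larger than 2^31 - 1
            long position = 0;              // where the file pointer currently sits
            while (position < target) {
                // each relative move fits in an I32
                int step = (int) Math.min(Integer.MAX_VALUE, target - position);
                // in LabVIEW terms: seek with pos mode = "current" and offset = step
                position += step;
            }
            System.out.println("reached byte " + position);
        }
    }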

  • Web Service to transmit a file (a very large file)

    Greetings,
    Is it possible to use a web service to transmit a file, sometimes a very large file? If so, what are the security implications of doing such a thing?
    Thanks,
    Charles

    Quick answer: It depends... Did you try using Google to look up examples of doing this? Here is a URL to try: http://www.google.com/#sclient=psy&hl=en&source=hp&q=web+service+transfer+file&pbx=1&oq=web+service+transfer+file&aq=f&aqi=g1g-j3g-b1&aql=&gs_sm=e&gs_upl=1984731l1997518l0l1997874l25l19l0l1l1l0l304l3879l0.7.10.1l18l0&bav=on.2,or.r_gc.r_pw.&fp=7caaed5eb0414d97&biw=1280&bih=871
    Thank you,
    Tony Miller
    Webster, TX
    Never Surrender Dreams!
    JMS
    If this question is answered, please mark the thread as closed and assign points where earned..
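
    It is possible; a common approach on the Java side is a JAX-WS service with MTOM enabled, so the file travels as a binary attachment rather than as inline base64. A minimal sketch with illustrative names (FileTransferService, upload) and no error handling:

    import javax.activation.DataHandler;
    import javax.jws.WebMethod;
    import javax.jws.WebService;
    import javax.xml.ws.soap.MTOM;

    // Illustrative service: the client passes the file as a DataHandler attachment.
    @MTOM
    @WebService
    public class FileTransferService {
        @WebMethod
        public String upload(String fileName, DataHandler content) {
            // A real implementation would stream content.getInputStream() to storage
            // and enforce authentication, authorization, and a size limit first.
            return "received " + fileName;
        }
    }

    On the security side the usual points apply: run the service over HTTPS, authenticate and authorize the caller, and cap the accepted file size so the endpoint cannot be used to exhaust disk or memory.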

  • Exporting to iTunes results in a very large file

    I have been making music in GarageBand and when I export it to iTunes, it becomes a very large file. This prevents me from posting my music on my website or emailing it. What can I do to make the file smaller?
    imac G5   Mac OS X (10.4.5)   1.9 GHz PowerPC G5

    You should convert your file to an MP3 in iTunes:
    - Make sure that in the import options of iTunes, MP3 is selected.
    - Mark your song in iTunes and select "convert to MP3" in the menu.
    - This creates a new file in iTunes with the same title. If you don't know which is which, apple-I will tell you if it's an AIFF or an MP3, and the file size.

  • How do I share a very large file?

    How do I share a very large file?

    Do you want to send a GarageBand project or the bounced audio file?  Sending an audio file is not a problem as-is, but if you want to send the project, use "File > Compress" to create a .zip file of the project before you send it.
    If you have a Dropbox account,  I'd simply copy the file into the Dropbox "public" folder and mail the link. Right-click the file in the Dropbox, then choose Dropbox > Copy Public Link. This copies an Internet link to your file that you can paste anywhere: emails, instant messages, blogs, etc.
    2 GB on Dropbox are free.     https://www.dropbox.com/help/category/Sharing

  • I inadvertently created a very large file. Not needing it, I sent it to the Trash. However, the deletion process never completes. Any suggestions as to how to delete it?

    I inadvertently created a very large file on my hard drive. Not needing it, I sent it to the Trash. However, the deletion process never completes. Any suggestions as to how to delete it? I re-installed the OS but the file was still there, taking up needed space.

    Command-click all of the files you want to delete and don't actually move them to the Trash until you've gone through the entire folder.
    (63341)

  • Handling a very large file from the sender system

    Hi Sdn,
    Can you please tell me what should be done on the SAP XI side to handle a very large file coming from the sender system and post it to the receiver? What should be done on the XI end so that there is no problem handling the large file?
    Thanks and regards,
    Aniruddha Bhattacharya

    Hi Aniruddha,
    Go to the link below and check point 3.1.1; maintain the parameters as mentioned in that point.
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/70ada5ef-0201-0010-1f8b-c935e444b0ad

  • Sorting very large file

    Hi,
    I have a very large file (1.3 GB, more than 10 million records) that I need to sort. What is the best possible way to do that without using a database?
    Thanks,
    Gary

    Just a suggestion:
    If you create a sorted red/black tree of record numbers (i.e. offsets in the data file), you could get an equivalent sort as follows (probably without a huge amount of memory...):
    Here's the strategy:
    Create a comparator that compares two Long objects. It performs a seek in the data file, retrieves the records pointed to by those two Longs, and compares them using whatever record comparator you were going to use.
    Create a sorted list based on that comparator (a tree-based list will probably be best)
    Run a for loop going from 0 to the total records in the data file, create Long objects for each value, and insert that value into the list.
    When you are done, the list will contain the record numbers in sorted order.
    Create an iterator for the list
    For each element in the list, grab the corresponding data file record from the source file and write it to the destination file.
    voila
    Some optimizations may be possible (and highly desirable, actually):
    When you read records from the file, place them into an intermediate cache (linked list?) of limited length (say 1000 records). When a record is requested, scan the cache for that record ID - if found, move it to the top of the list, then return the associated data. If it is not found, read from the file, then add to the top of the list. If the list has more than X records, remove the last item from the list.
    The upper pieces of the sort tree are going to get hammered, so having a cache like this is a good idea.
    Another possibility: Instead of inserting the records in order (i.e. 0 through X), it may be desirable to insert them in pseudo-random order - I seem to remember that there was a really nice algorithm for doing this (without repeating any records) in an issue of Dr. Dobbs from awhile back. You need to get a string of random numbers between 0 and X without repeats (like dealing from a card deck).
    If your data is already in relatively sorted order, the cache idea will not provide significant benefit without randomizing the inputs.
    One final note: Try the algorithm without the cache first and see how bad it is. The OS's disk caching may provide sufficient performance. If you encapsulate the data file (i.e. an object that has a getRecordBytes(int recNumber) method), then it will be easy to implement the cache later if it is needed.
    I hope this provides some food for thought!
    - K
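
    A minimal Java sketch of the strategy above, assuming fixed-length records and illustrative file names, and using Collections.sort over the collected offsets instead of inserting into a tree (the comparator idea is the same):

    import java.io.*;
    import java.util.*;

    public class OffsetSort {
        static final int RECORD_LEN = 128;   // assumed fixed record length in bytes

        public static void main(String[] args) throws IOException {
            final RandomAccessFile in = new RandomAccessFile("data.dat", "r");  // hypothetical input file
            long count = in.length() / RECORD_LEN;

            // One Long per record: the record's byte offset in the data file.
            List<Long> offsets = new ArrayList<Long>();
            for (long i = 0; i < count; i++) offsets.add(i * RECORD_LEN);

            // The comparator seeks into the data file and compares the two records it points at.
            Collections.sort(offsets, new Comparator<Long>() {
                public int compare(Long a, Long b) {
                    try {
                        return new String(read(in, a)).compareTo(new String(read(in, b)));
                    } catch (IOException e) {
                        throw new RuntimeException(e);
                    }
                }
            });

            // Write the records out in sorted order.
            OutputStream out = new BufferedOutputStream(new FileOutputStream("sorted.dat"));
            for (Long off : offsets) out.write(read(in, off));
            out.close();
            in.close();
        }

        static byte[] read(RandomAccessFile f, long offset) throws IOException {
            byte[] buf = new byte[RECORD_LEN];
            f.seek(offset);
            f.readFully(buf);
            return buf;
        }
    }

    The caching optimization described above could be layered on by having read() consult a small LinkedHashMap used as an LRU cache (override removeEldestEntry) before touching the file.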

  • Very large files checkin to DMS/content server

    Dear experts,
    We have DMS with content server up and running.
    However, there is a problem with very large files (up to 2GB!) - these files cannot be checked in due to extremely long upload times and maybe GUI limitations.
    Does anyone have suggestions how we could get very large files into the content server ? I only want to check them in (either via DMS or directly to the content server) and then get the URL back.
    best regards,
    Johannes

    Hi Johannes,
    unfortunately there is a file-size limit for the Content Server of about 2 GB. Please note that such large files will cause very long upload times. If possible, I would recommend splitting the files into smaller parts and checking those in.
    From my point of view it is not recommended to put files directly on the Content Server, because this could lead to inconsistencies on the Content Server.
    Best regards,
    Christoph

  • Hello, I am having issues opening very large files I created, one being 1,241,776 KB. I have PS 12.1, 64-bit version. I am not sure whether the issues I am having are because of the PS version I have, and whether or not I have to upgrade.

    Hello, I am having issues opening very large files I created, one being 1,241,776 KB. I have PS 12.1, 64-bit version. I am not sure whether the issues I am having are because of the PS version I have, and whether or not I have to upgrade.

    I think more likely, it's a memory / scratch disk issue.  1.25 gigabytes is a very big image file!!
    Nancy O.

  • Tips or tools for handling very large file uploads and downloads?

    I am working on a site that has a document repository feature. The documents are stored as BLOBs in an Oracle database, and for reasonably sized files it's no problem to stream the files out directly from the database. For file uploads, I am using the Struts module to get them on disk and am then putting the BLOB in the database.
    We are now being asked to support very large files of 250MB+. I am concerned about problems I've heard of with HTTP not being reliable for files over 256MB. I'd also like a solution that would give the user a status bar and allow for restarts of broken uploads or downloads.
    Does anyone know of an off-the-shelf module that might help in this regard? I suspect an ActiveX control or Applet on the client side would be necessary. Freeware or Commercial software would be ok.
    Thanks in advance for any help/ideas.

    Hi. There is nothing wrong with HTTP handling 250MB+ files (per se).
    However, connections can get reset.
    Consider offering the files via FTP. Most FTP clients are good about resuming transfers.
    Or if you want to keep using HTTP, add support for resumable downloads (HTTP Range requests / byte serving). Then a user can use something like 'GetRight' to auto-resume HTTP downloads.
    Hope that helps,
    Peter
    http://rimuhosting.com - JBoss EJB/JSP hosting specialists
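
    A rough sketch of the resumable-download side in a servlet, assuming hypothetical getBlobLength()/getBlobStream() helpers that read the document out of the Oracle BLOB (error handling and validation omitted):

    import java.io.*;
    import javax.servlet.http.*;

    // Honors an HTTP Range header so clients such as GetRight can resume a broken download.
    public abstract class DocumentDownloadServlet extends HttpServlet {

        protected abstract long getBlobLength();            // total stored size in bytes
        protected abstract InputStream getBlobStream();     // stream over the BLOB contents

        protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
            long length = getBlobLength();
            long start = 0, end = length - 1;

            String range = req.getHeader("Range");           // e.g. "bytes=1048576-"
            if (range != null && range.startsWith("bytes=")) {
                String[] parts = range.substring(6).split("-");
                start = Long.parseLong(parts[0]);
                if (parts.length > 1 && parts[1].length() > 0) end = Long.parseLong(parts[1]);
                resp.setStatus(HttpServletResponse.SC_PARTIAL_CONTENT);   // 206
                resp.setHeader("Content-Range", "bytes " + start + "-" + end + "/" + length);
            }
            resp.setHeader("Accept-Ranges", "bytes");
            resp.setHeader("Content-Length", String.valueOf(end - start + 1));

            InputStream in = getBlobStream();
            long skipped = 0;
            while (skipped < start) skipped += in.skip(start - skipped);   // position at the range start

            OutputStream out = resp.getOutputStream();
            byte[] buf = new byte[8192];
            long remaining = end - start + 1;
            int n;
            while (remaining > 0 && (n = in.read(buf, 0, (int) Math.min(buf.length, remaining))) != -1) {
                out.write(buf, 0, n);
                remaining -= n;
            }
            in.close();
        }
    }

    This only covers downloads; resumable uploads with a progress bar generally still need client-side support (an applet or similar, as suggested in the question) plus server-side reassembly of chunks.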

  • Very large file (1.6 GB): sports program for school, lots of photos, a template where media spots were saved and photos dropped in. Trying to get it to a printer who says it's too large, but when he reduces the size it loses all quality... HELP??

    Please help... large file, my son's athletic program. Had no problem with quality last year. This new printer says the file is too large, too many layers; can I FLATTEN it? He reduced the size and the photos look awful. It is 81 pages. Have to have it complete before next Friday!! Help anyone??

    At that size it sounds like you have inserted lots of photos at too large a size and resolution.
    Especially for a yearbook-style project, first determine your image sizes, crop them, and fix the resolution at 300 dpi, then drag them into the publication. Pages does not use external references to files and rapidly grows to quite a large size unless you trim and size your images before placement.
    Peter
