Problem with large spool lists

Hello everyone,
we have the following problem with one of our printers. When we try to print larger lists (>30 pages), the spool shows an "unknown error" and only a few pages are printed.
We have the following setup: a Kyocera FS-9520DN printer with device type KYOAAB1C, connected via an SAProuter connection with host spool access method U. The only error I get from the trace file of the spool work process is
S Tue Feb 22 07:34:37 2011
S  *** ERROR => cannot write to connection [rspoclpd.c   857]
S  *** ERROR => Lost connection, control file not sent [rspoyunx.c   1754]
T  new memory block 000000002B6D6CE0
S  handle memory type is RSTSPROMMM
and the SP01 error message
Status:              Incorrect (Reason unknown)
Job status:          Completed
Area:                Unknown
Error:               Reason unknown
Last event:
Message:             Processing completed by spool work process
Any suggestions on what might improve the situation? Thanks in advance,
Marco

Hi,
A direct connection to a remote printer is not recommended, as per SAP Note 64628.
The best option is to define the printer locally on the SAP server with access method L for Unix or access method C for Windows. If you need to use remote access method U or S, you should place a print server between the SAP system and the printers.
Regards,
Aidan

Similar Messages

  • Out.println() problems with large amount of data in jsp page

    I have this kind of code in my jsp page:
    out.clearBuffer();
    out.println(myText); // size of myText is about 300 kB
    The problem is that I manage to print the whole text only sometimes. Very often the receiving page gets only the first 40 kB and then the printing stops.
    I have run tests where I split myText into smaller parts and out.print() them one by one:
    Vector texts = splitTextToSmallerParts(myText);
    for(int i = 0; i < texts.size(); i++) {
      out.print(texts.get(i));
      out.flush();
    }
    This produces the same kind of result: sometimes all parts are printed, but mostly only the first parts.
    I have tried increasing the buffer size, but that does not make the printing reliable either. I have also tried autoFlush="false" so that I flush before the buffer overflows; again the same result, sometimes it works and sometimes it doesn't.
    Originally I use a setup where Visual Basic in Excel calls a JSP page. However, I don't think this matters, since the same problems occur when I use a browser.
    If anyone knows anything about problems with large JSP pages, I would appreciate the help.
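    As an editorial aside, a self-contained version of the chunk-and-flush approach described above might look like the sketch below. The splitTextToSmallerParts helper is not shown in the original post, so this implementation is an assumption (fixed-size chunks):

    ```java
    import java.io.PrintWriter;
    import java.io.StringWriter;
    import java.util.ArrayList;
    import java.util.List;

    public class ChunkedWriter {

        // Hypothetical version of the splitTextToSmallerParts helper from the
        // post: cuts the text into pieces of at most chunkSize characters.
        static List<String> splitTextToSmallerParts(String text, int chunkSize) {
            List<String> parts = new ArrayList<>();
            for (int i = 0; i < text.length(); i += chunkSize) {
                parts.add(text.substring(i, Math.min(i + chunkSize, text.length())));
            }
            return parts;
        }

        // Writes the parts one by one, flushing after each, as in the post.
        static void writeInChunks(PrintWriter out, String text, int chunkSize) {
            for (String part : splitTextToSmallerParts(text, chunkSize)) {
                out.print(part);
                out.flush();
            }
        }

        public static void main(String[] args) {
            // A StringWriter stands in for the JSP response writer here.
            StringWriter sink = new StringWriter();
            String text = "x".repeat(300_000); // roughly the 300 kB from the post
            writeInChunks(new PrintWriter(sink), text, 8192);
            System.out.println(sink.toString().length()); // all 300000 characters arrive
        }
    }
    ```

    The splitting itself is not the suspect part; in the real JSP case the truncation is more likely on the connection or container side, which code like this can help rule out.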

    Well, there are many ways you could do this, but it depends on what you are looking for.
    For instance, generating an Excel Spreadsheet could be quite easy:
    import javax.servlet.*;
    import javax.servlet.http.*;
    import java.io.*;
    public class TableTest extends HttpServlet {
         public void doGet(HttpServletRequest request, HttpServletResponse response) throws IOException, ServletException {
              response.setContentType("application/vnd.ms-excel");
              PrintWriter out = new PrintWriter(response.getOutputStream());
              out.println("Col1\tCol2\tCol3\tCol4");
              out.println("1\t2\t3\t4");
              out.println("3\t1\t5\t7");
              out.println("2\t9\t3\t3");
              out.flush();
              out.close();
         }
    }
    Just try this simple code, it works just fine... I used the same approach to generate a report of 30,000 rows and 40 columns (more or less 5 MB), so it should do the job for you.
    Regards

  • Problem with printing ALV lists

    Hey Guys,
    I have a problem with printing ALV lists.
    I created a report with several ALV lists (not grids) on the same screen, but when I attempt to print the report,
    it prints each ALV list on a different page. So if I have 3 ALV lists in the same report, it will print the report on 3 pages.
    How can I print them all on one page?
    Thanks in advance
    Noha Salah.

    Hey Max,
    I tried setting Layout-list_append before my block_list_append function call
    and setting is_print-NO_NEW_PAGE. It printed the 3 lists on one page; the only problem I have
    is that the lists are truncated and the list formats have been totally messed up. How can I restore them
    to their original format?

  • Apex 4.1 - problem with navigation component LIST after upgrade

    There seems to be a problem with tags in the LIST component after the upgrade from the 4.0 to the 4.1 release.

    Right, we verified a problem with migrated lists.
    We have lists of links that are not based on a specific template. After the upgrade to 4.1, they stopped rendering in their region. We found that the <TABLE> tag was not included in the HTML source, hence the list body was not built to render.
    The only way we found to include it was to create a template for lists (there is not one by default in certain APEX themes), edit it, and include the tag (in the section "Before List Entry" > "List Template Before Rows"; the tag is not present in a newly created template either). Then you need to assign the template to the existing lists to make the list body reappear. It doesn't seem to be good practice to create a list without a template, but APEX should at least provide one by default.
    Edited by: Kleber M on Oct 28, 2011 9:03 AM

  • Problems with large scanned images

    I have been giving Aperture another try since 1.1 came out, and I am still having problems with large TIFF files derived from scanned 4x5 negatives. The files are 500 MB or more, 16-bit RGB, with ProPhoto RGB or Ektaspace PS5 profiles, directly out of the scanner.
    Aperture imports the files correctly and shows their thumbnails. When I select a thumbnail, "Loading" is displayed briefly, and then the dreaded "Unsupported Image Format" is displayed. Sometimes "Loading" goes on for a while, and a geometric pattern (looking like a rendering of random memory) is displayed. Restarting Aperture doesn't help.
    Lower resolution (250 MB, 16-bit) files are handled properly. The scans are from an Epson 4870 scanner. I have tried pulling the scans into Photoshop and resaving with various TIFF options, and as PSD, with no improvement. I have the same problem with corrected/modified PSD files coming out of Photoshop CS2.
    I am running on a Power Mac G5 dual 2 GHz with 8 GB of RAM and an NVIDIA GeForce 6800 GT DDL (250 MB) video card, with all the latest OS and software updates.
    Has anyone else had similar problems? More importantly, is anyone else able to work with 500 MB files of any kind? Is it my system, or is it the software? I sent feedback to Apple as well.
    dual g5 2ghz   Mac OS X (10.4.6)  

    I have a few (well, actually about 100) scans on my system of >500 MB. I tried loading a few and am getting an inconsistent pattern of errors that correlates with what you are reporting.
    I imported 4 files and three were troubled; the fourth was OK. I imported another four files and the first one was OK while the three others had your reported error; also, the previously good file from the first import was now showing the same 'unsupported image' message.
    I would venture to say that if you shoot primarily 4x5 and work with scans of this size, Aperture is not the program for you--right now. I shoot 35mm and have a few images that I have scanned at 8000 dpi on my Imacon 848, but most of my files are in the more reasonable 250 MB range (35mm @ 5000 dpi).
    I will probably downsample my 8000 dpi scans to 5000 dpi and not worry too much about it. In a world where people believe that 16 megapixels is hi-res, you are obviously on the extreme side. (Good for you!) You should definitely file a bug report, but I wouldn't expect much help anytime soon for your super-sized scans.

  • RESOLVED - K8N Neo2 Platinum problem with large Seagate disk

    Hi All,
    I've been running this system for 4 years and I'm very happy with it. But I've run into a problem:
    I'm trying to exchange my 120 GB Seagate Barracuda system disk for a 500 GB Barracuda, and I can't install XP with SP2.
    First I tried to clone the old HD with Norton Ghost and Acronis True Image. The cloning looked fine, but after taking the old HD out and moving the new one to P-ATA 1 as primary master I get an "error - no operating system found". I also tried a fresh install from CD. After the CD reboots the system, I get the same error.
    The HD is present in BIOS with the right parameters and all is looking well - but still no go.
    I've fiddled around in BIOS (vers. 1.3) with the different settings (LBA, Large, Auto, CHS). The last 2 options give the "no operating system" error and the first 2 give a "disk error", although Seagate says I should change the setting to "LBA".
    I only have the new HD in the machine. I also tried fixboot and fixmbr. I can see all the installed files when I put it in as a slave.
    So my question is:
    Does the K8N Neo2 Platinum have a problem with large hd's as system disk?
    Or is this a Seagate or Windows problem ?
    Any help is much appreciated!
    Mads

    Actually, that was the first thing I did, just to see if the drive worked :-) - except that I had it as master on the second IDE cable. It formats like it should and all partitions report "healthy". The only difference is that the new drive's C partition is set as active (I can't find a way to change it back), while the old one is set as system. Could this make a difference?
    Since neither cloning nor a fresh install works I think the problem is something else. Maybe I should try Seatools and see if that makes any difference.
    Thanks for all the help so far! - Other suggestions?

  • Problem with archiving print lists

    Hi all,
    we have an issue with asynchronous archive requests.
    The archiving of some spool lists fails. Here the error message in OAM1:
    Order History
    12.03.2010 19:36:10     Request Created
    12.03.2010 19:36:10     Asynchronous request started; awaiting confirmation
    12.03.2010 19:36:10     Content Repository Reports Error
    JobLog:
    Job started
    Step 001 started (program RSCMSCAJ, variant &0000000023175, user ID XXXXXX)
    Error occurred during character conversion
    SAP ArchiveLink (error in storage system)
    Job cancelled after system exception ERROR_MESSAGE
    It gets weird when I reprocess the request in OAM1, because then it works and we get no error message. It seems that when we start 50+ requests, some jobs fail, while if we start archiving fewer than 10 requests, everything works fine.
    We tried archiving 800+ spool lists in our QAS system with no problem. So we suspected that our PRD content server might be broken or something, and tried archiving spool lists from our SAP PRD system to our QAS content server -> same issue. The only thing left would be a misconfiguration in our SAP PRD system, but a few weeks ago everything worked fine and we had no issues.
    Has anybody faced similar issues?
    Regards
    Davor

    hi,
    Sorry, I have to answer with questions:
    Are you archiving spools larger than 2 GB?
    Can you send us the logs from the content server?
    Can you check OAM1 for more info (check the logging)?
    Rgds
    Nico

  • Problems with EC Sales List in PDF-Form

    Hi,
    I have a problem with the EC Sales List (RFASLM00).
    When I use it with SAP script form output (F_ASL_DE), I am asked
    for a printer and it works; I get a spool number.
    When I use it with PDF form output (F_ASL_DE), I am asked for a printer
    (I don't use LOCL). When finished, I get the message:
    Spool number 0000000000 LIST2S - EC sales list for 0888.
    No spool is created and I don't know where I can find the output.
    Hope anyone can help.
    Regards, Dieter

    There is no need to take the output as PDF.
    Do one thing:
    run the report and generate a spool for LOCL,
    then
    run report RSTXPDFT4 with that spool number and you will get the output in PDF format.
    Regards
    Prabhu

  • Problem with large databases

    Lightroom doesn't seem to like large databases.
    I am playing catch-up using Lightroom to enter keywords to all my past photos. I have about 150K photos spread over four drives.
    Even placing a separate database on each hard drive is causing problems.
    The program crashes when importing large numbers of photos from several folders. (I do not ask it to render previews.) If I relaunch the program, and try the import again, Lightroom adds about 500 more photos and then crashes, or freezes again.
    I may have to go back and import them one folder at a time, or use iView instead.
    This is a deal-breaker for me.
    I also note that it takes several minutes after opening a database before the HD activity light stops flashing.
    I am using XP on a dual-core machine with 3 GB of RAM.
    Anyone else finding this?
    What is your work-around?

    Christopher,
    True, but given the number of posts where users have had similar problems ingesting images into LR, where LR runs without crashes and further trouble once the images are in, the probative evidence points to some LR problem ingesting large numbers.
    It may also be that users are attempting to use LR for editing during the ingestion of large numbers; I found that I simply could not do that without a crash occurring. When I limited it to 2k at a time, leaving my hands off the keyboard while the import occurred, everything went without a hitch.
    However, as previously pointed out, it shouldn't require that; none of my other DAMs using SQLite do, and I can multitask while they are ingesting.
    But you are right: multiple single causes, and complexly interrelated multiple causes, could account for it on a given configuration.

  • Problem with Business Partner List - State

    I'm using SAP B1 2007A. We recently added several International countries and their regions (states). We are having a problem with the state that appears on the Business Partner List (list that appears when you search using part of a company name or *). The query that gets the data does not take into consideration the Country where the state is located. The result is that the query returns the first state it finds based on the state code not the state associated with the country.
    Is there a fix for this query, or is there a way I can adjust the query to get the correct state results in the Business Partner List?
    Thanks,
    Mark

    No. Different state name but same state code.
    Ex:
    GA - Georgia in USA
    GA - Galapagos Islands in Galapagos Islands
    When searching for the Business Partner the city is Atlanta the Country is USA but the State is Galapagos Islands not Georgia.
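    To illustrate the ambiguity in plain code (this is not SAP B1's actual query, just an editorial sketch of the lookup problem; the country codes are illustrative): keying states by code alone returns an arbitrary colliding entry, while keying by country plus code is unambiguous.

    ```java
    import java.util.HashMap;
    import java.util.Map;

    public class StateLookup {

        // Build an unambiguous key from country + state code (the country
        // codes used below are illustrative, not SAP B1 values).
        static String stateKey(String country, String stateCode) {
            return country + ":" + stateCode;
        }

        public static void main(String[] args) {
            // Keyed by state code alone: the two "GA" entries collide and the
            // second insert silently overwrites the first.
            Map<String, String> byCodeOnly = new HashMap<>();
            byCodeOnly.put("GA", "Georgia");
            byCodeOnly.put("GA", "Galapagos Islands");
            System.out.println(byCodeOnly.get("GA")); // Galapagos Islands, not Georgia

            // Keyed by country + state code: both states coexist.
            Map<String, String> byCountry = new HashMap<>();
            byCountry.put(stateKey("US", "GA"), "Georgia");
            byCountry.put(stateKey("EC", "GA"), "Galapagos Islands");
            System.out.println(byCountry.get(stateKey("US", "GA"))); // Georgia
        }
    }
    ```

    The fix for the Business Partner List query would be the same idea: join the state table on both the state code and the country.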

  • Help me please: Serious problems with collection-mapping, list-mapping and map-mapping

    Hi everybody;
    I have serious problems with list-mapping, collection-mapping and map-mapping.
    According to the specifications and requirements of a system I am working on, it needs to
    get a "list" of values or an individual value. I am working with the ORACLE 9i Database,
    ORACLE 9i AS and ORACLE 9i JDEVELOPER.
    I tried to map a master-detail relationship in an entity bean using list-mapping.
    This was very useful for getting a "list" of details, but when I wanted
    to get a single value I had some problems with persistence, something about "saving a state",
    even though I just want to get the value of a single detail.
    I decided to change it to map-mapping and the problem related to a single detail
    was solved, but now I cannot access the whole set of details.
    Can any of you help me with that?
    I am very confused; I do not know what to do.
    Have any of you a solution for that problem?
    Thank you very much.

    Have you tried a restore in iTunes?

  • Problem with Billing due list, relating to Returns Credits

    Hi
    I have a query/problem with the billing due list.
    I cannot create billing documents via VF04 for returns credits, only via VF01.
    I have changed the billing relevance of item category REN to A - delivery related,
    as we previously had problems with an incorrect quantity being billed if the returned delivery quantity was different from the returns order quantity. This resolved that problem, but now I cannot create billing via the billing due list.
    The message 'delivery type LR cannot be invoiced with billing type F2' appears.
    If I enter billing type RE, then the document is not getting picked up at all.
    When the billing relevance of the item category was B, order related, all seemed to be working OK.
    I have copy controls set up from LR to RE with item category REN, with pricing type G.
    Please advise how I can get the returns to appear in the billing due list.
    many thanks for your help
    Tony

    Hi,
    In copy control VTFL,
    LR to F2 at item level:
    REN >>> Details >>> set the billing quantity to D - delivery related.
    Kapil

  • Problem with large data report

    I tried to run a template I got from release 12 using data from the release we are using (11i). The XML file is about 13,500 KB.
    When I run it from my desktop, I get the following error (mostly no output is generated; sometimes it is generated after a long time).
    Font Dir: C:\Program Files\Oracle\BI Publisher\BI Publisher Desktop\Template Builder for Word\fonts
    Run XDO Start
    RTFProcessor setLocale: en-us
    FOProcessor setData: C:\Documents and Settings\skiran\Desktop\working\2648119.xml
    FOProcessor setLocale: en-us
    I assumed there may be compatibility issues between 12i and 11i, hence I tried to write my own template and ran into the same issue
    when I added the third nested loop.
    I also noticed javaws.exe runs in the background hogging a lot of memory. I am using BI version 5.6.3.
    I tried to run the template through the template viewer. The process never completes.
    The log file is
    [010109_121009828][oracle.apps.xdo.template.FOProcessor][STATEMENT] FOProcessor.setData(InputStream) is called.
    [010109_121014796][][STATEMENT] Logger.init(): *** DEBUG MODE IS OFF. ***
    [010109_121014796][oracle.apps.xdo.template.FOProcessor][STATEMENT] FOProcessor.setTemplate(InputStream)is called.
    [010109_121014796][oracle.apps.xdo.template.FOProcessor][STATEMENT] FOProcessor.setOutput(OutputStream)is called.
    [010109_121014796][oracle.apps.xdo.template.FOProcessor][STATEMENT] FOProcessor.setOutputFormat(byte)is called with ID=1.
    [010109_121014796][oracle.apps.xdo.template.FOProcessor][STATEMENT] FOProcessor.setLocale is called with 'en-US'.
    [010109_121014796][oracle.apps.xdo.template.FOProcessor][STATEMENT] FOProcessor.process() is called.
    [010109_121014796][oracle.apps.xdo.template.FOProcessor][STATEMENT] FOProcessor.generate() called.
    [010109_121014796][oracle.apps.xdo.template.FOProcessor][STATEMENT] createFO(Object, Object) is called.
    [010109_121318828][oracle.apps.xdo.common.xml.XSLT10gR1][STATEMENT] oracle.xdo Developers Kit 10.1.0.5.0 - Production
    [010109_121318828][oracle.apps.xdo.common.xml.XSLT10gR1][STATEMENT] Scalable Feature Disabled
    End of Process.
    Time: 436.906 sec.
    FO Formatting failed.
    I can't seem to figure out whether this is a looping, large-data, or BI version issue. Please advise.
    Thank you

    The report will probably fail in a production environment if you don't have enough heap. 13 MB is a big XML file for the parsers to handle; it will probably crush the OPP. The whole document has to be loaded into memory, and preserving the relationships in the document is probably what's killing your performance. The OPP or FOProcessor does not use the SAX parser the way the bursting engine does. I would suggest setting a maximum on the number of documents that can be created and submitting in a set of batches. That will reduce your XML file size and performance will increase.
    An alternative to the previous approach would be to write a concurrent program that merges the PDFs using the document merger API. This would allow you to burst the document into a temp directory and then re-assimilate the pieces into one document. One disadvantage of this approach is that the PDF is going to be freakin' huge. Also, if you have to send that piggy to the printer, you're going to have some problems too. When you convert the PDF to PS, the files are going to be massive because of the loss of compression; it gets even worse if the PDF has images... Then you'll have more problems with disk space on the server and/or running out of memory on PS printers.
    All of the things I have discussed I have done in some fashion. Speaking from experience, a 13 MB XML file is just a really bad idea. I would go with option one.
    Ike Wiggins
    http://bipublisher.blogspot.com
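    The batching idea in the first suggestion can be sketched generically (this is plain Java, not the BI Publisher or concurrent-program API; all names here are illustrative): cap the number of documents per run so each submission produces a smaller XML file.

    ```java
    import java.util.ArrayList;
    import java.util.List;

    public class Batcher {

        // Split a list of document IDs into batches of at most batchSize, so
        // each submission produces a smaller XML file for the formatter.
        static <T> List<List<T>> toBatches(List<T> items, int batchSize) {
            List<List<T>> batches = new ArrayList<>();
            for (int i = 0; i < items.size(); i += batchSize) {
                // Copy the sublist so each batch is independent of the source list.
                batches.add(new ArrayList<>(items.subList(i, Math.min(i + batchSize, items.size()))));
            }
            return batches;
        }

        public static void main(String[] args) {
            List<Integer> ids = new ArrayList<>();
            for (int i = 1; i <= 10; i++) ids.add(i);
            // 10 documents with a cap of 4 per run -> 3 batches (4 + 4 + 2)
            System.out.println(Batcher.toBatches(ids, 4).size()); // 3
        }
    }
    ```

    Each batch would then be submitted as its own request, keeping the per-run XML well under the size that chokes the parser.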

  • Problems with customizing select lists and popup LOVs

    Hi
    I have 2 problems about select lists and popup LOVs.
    The first one is about a select list in a tabular form.
    It should be created with APEX_ITEM.SELECT_LIST_FROM_LOV or similar and take its values from a named LOV.
    This worked fine, but now it should also have the possibility to enter a free value.
    I tried to accomplish that by creating an APEX_ITEM.POPUP_FROM_LOV, but there is a problem with the function that is called by the arrow icon to the right of the input field (e.g. genList_f11_5()).
    If the row is added by addRow, then it works fine, but if the row is not empty,
    then the function call is like genList_f11_$_row() and the input field gets no value when a LOV option is selected.
    The other problem is about a select list which should have the possibility to enter a custom value, and
    there should also be the possibility to select several values. I tried to implement this with a text area containing the selected values and a multiple select list, with an event handler in each option. The user could click options and they would be copied to the text area. The problem is that I couldn't make the event handler work in IE.
    I would appreciate any ideas about either of these problems.
    Tiina

    Hi,
    If you download the application you can see the source.
    I have not written any instructions, sorry.
    If you are on APEX 4 you can just load the jQuery UI autocomplete library and take ideas from my app.
    If you download my sample as a zip, there is an uncompressed htmldbQuery library.
    You can look at that and take only the function htmldbAutocomplete.
    Then check the jQuery UI documentation:
    http://jqueryui.com/demos/autocomplete/#method-search
    There is a search method that you can use to open the list just by clicking the input.
    I hope this helps as a start.
    Regards,
    Jari

  • IPhoto performance problems with large library

    I recently purchased a new MacBook Pro Retina (2013 model) with top-of-line options. The hardware specs are 2.6Ghz Core i7 (8 cores), 16 GB RAM, 1 TB SSD, and OS X 10.9.1 (Mavericks).
    The machine came with iPhoto 11 (v9.5.1). I successfully ported my iPhoto library from my old MacBook Pro to the new machine. The iPhoto library is about 550 GB on disk, and consists of about 2000 Events, 95000 Photos, and about 50 Albums & Smart Albums. The images in the library are a mix of JPEG, RAW and other content types such as MOV and AVI etc.
    I have measured the performance of the internal SSD with tools like Blackmagic and seen read & write speeds in excess of 850 MB/sec (obviously sequential). This is the fastest disk performance of any machine I've ever worked on, so I doubt that disk speed is why iPhoto is slow.
    Despite the formidable hardware resources of this machine (proven running other workloads), iPhoto still behaves like a dog with my library. Opening iPhoto takes about 10 seconds to get to the first screen. Deleting a bunch of photos results in a spinning disc for about 10-15 seconds while the images move to the Trash. Emptying the trash with 500-1000 photos takes about 15 minutes. Reorganizing events can take up to 30 seconds. Even browsing from photo to photo incurs lags of several seconds. This lack of responsiveness makes the software frustrating, if not impossible, to use efficiently.
    During all these delays I have monitored system performance with tools like Activity Monitor and iStat Menus. The odd thing is that when iPhoto is spinning its disc, I am not seeing any kind of system resource bottleneck. In other words, CPU is barely utilized on any of the 8 cores, disk IO is minimal, and there is zero network IO. To ensure maximum RAM availability, I run iPhoto after a fresh reboot and do not launch any other apps while it's running. For the life of me I cannot imagine what iPhoto is doing.
    I have also tried several of the library recover / rebuild options as well, but no problems were reported with the library and performance did not improve.
    I am considering upgrading to Aperture, but I wonder whether both these solutions have serious design limitations when working with large libraries, which result in significant performance issues.
    I would be interested in learning what experience other users have when working on iPhoto libraries of similar size.
    Summary:
    iPhoto library size on disk: 550 GB
    Events: approx 2,000
    Photos: approx 95,000
    Albums & Smart Albums: approx 100
    Regards,
    Nico

    Hi Léonie, all the software is at the latest update. The library was simply copied from the old machine to the new machine. When I launched the new version of iPhoto (on the new machine), I pointed it to the copied iPhoto library and it picked it up without any issue. The first time I opened it, iPhoto spent some time migrating the data from the old iPhoto format to the new, but that was done only once. I am not sharing the library between the old iPhoto and the new; the migrated library is now for the exclusive use of the new iPhoto.
    Regarding error messages in the syslog, it's quite funny: I do see them, and they reflect the fact that iPhoto has gone off to la-la land for extended periods of time, ignoring all user input (this is what produces the spinning disc in Mac OS):
    1/28/14 9:16:18.792 PM WindowServer[127]: disable_update_timeout: UI updates were forcibly disabled by application "iPhoto" for over 1.00 seconds. Server has re-enabled them.
    1/28/14 9:16:20.926 PM WindowServer[127]: common_reenable_update: UI updates were finally reenabled by application "iPhoto" after 3.13 seconds (server forcibly re-enabled them after 1.00 seconds)
    So that was a common 3-second "hang" of iPhoto. Nothing apart from those kinds of messages.
    Thanks for the links to iStat... I do in fact have the latest version of it and don't have that problem with Mavericks.
