Server goes out of memory when annotating TIFF File. Help with Tiled Images

I am new to JAI and have a problem with the server running out of memory.
Objective:
1) Load a TIFF file (each approx. 5-8 MB when compressed with CCITT T.6 compression)
2) Annotate the image (think of it as a simple drawString via the Graphics2D object of the RenderedImage)
3) Send it to the servlet output stream
Problem:
The server goes out of memory when 5 threads access it concurrently.
Runtime conditions:
VM param set to -Xmx1024m
Observation
Writing the files takes far longer than reading them.
Some more information
1) I need to annotate at pre-defined positions on the images (e.g. in the first quadrant, or maybe in the second).
2) I know that the TiledImage class makes it possible to load up just a portion of the image and process it.
Things I need help with:
I do not know how to send the whole file back to the servlet output stream after annotating a single tile of the image.
If I write the tiled image back to a file, or to the output stream, it gives me only the tile I read in and watermarked, not the whole image file.
I have attached the code I use when I load up the whole image
Could somebody please help with the TiledImage solution?
Thx
public void annotateFile(File file, String wText, OutputStream out, AnnotationParameter param) throws Throwable {
    ImageReader imgReader = null;
    ImageWriter imgWriter = null;
    TiledImage in_image = null, out_image = null;
    IIOMetadata metadata = null;
    ImageOutputStream ios = null;
    try {
        Iterator<ImageReader> readIter = ImageIO.getImageReadersBySuffix("tif");
        imgReader = readIter.next();
        imgReader.setInput(ImageIO.createImageInputStream(file));
        metadata = imgReader.getImageMetadata(0);
        // Note: "fileload" decodes the entire image into memory here.
        in_image = new TiledImage(JAI.create("fileload", file.getPath()), true);
        System.out.println("Image Read!");
        Annotater annotater = new Annotater(in_image);
        out_image = annotater.annotate(wText, param);
        Iterator<ImageWriter> writeIter = ImageIO.getImageWritersBySuffix("tif");
        if (writeIter.hasNext()) {
            imgWriter = writeIter.next();
            ios = ImageIO.createImageOutputStream(out);
            imgWriter.setOutput(ios);
            ImageWriteParam iwparam = imgWriter.getDefaultWriteParam();
            if (iwparam instanceof TIFFImageWriteParam) {
                // Carry the source image's compression over to the output.
                iwparam.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
                TIFFDirectory dir = (TIFFDirectory) out_image.getProperty("tiff_directory");
                double compressionParam = dir.getFieldAsDouble(BaselineTIFFTagSet.TAG_COMPRESSION);
                setTIFFCompression(iwparam, (int) compressionParam);
            } else {
                iwparam.setCompressionMode(ImageWriteParam.MODE_COPY_FROM_METADATA);
            }
            System.out.println("Trying to write Image ....");
            imgWriter.write(null, new IIOImage(out_image, null, metadata), iwparam);
            System.out.println("Image written....");
        }
    } finally {
        if (imgWriter != null)
            imgWriter.dispose();
        if (imgReader != null)
            imgReader.dispose();
        if (ios != null) {
            ios.flush();
            ios.close();
        }
    }
}
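For the pre-defined positions described in point 1 above, the placement can be computed independently of how the image is loaded. Below is a minimal sketch of that part using plain java.awt; the quadrantOrigin helper, its quadrant numbering, and the offsets are my own invention for illustration, not JAI or ImageIO API:

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.Point;
import java.awt.image.BufferedImage;

public class QuadrantAnnotator {

    // Map a quadrant number (1..4, numbered like the Cartesian plane:
    // 1 = top-right, 2 = top-left, 3 = bottom-left, 4 = bottom-right)
    // to the top-left pixel of that quadrant.
    public static Point quadrantOrigin(int width, int height, int quadrant) {
        int halfW = width / 2, halfH = height / 2;
        switch (quadrant) {
            case 1: return new Point(halfW, 0);
            case 2: return new Point(0, 0);
            case 3: return new Point(0, halfH);
            case 4: return new Point(halfW, halfH);
            default: throw new IllegalArgumentException("quadrant must be 1..4");
        }
    }

    // Draw the watermark text inside the requested quadrant of the image.
    public static void annotate(BufferedImage img, String text, int quadrant) {
        Point origin = quadrantOrigin(img.getWidth(), img.getHeight(), quadrant);
        Graphics2D g = img.createGraphics();
        try {
            g.setColor(Color.BLACK);
            // Offset a little into the quadrant so the text is not clipped.
            g.drawString(text, origin.x + 10, origin.y + 20);
        } finally {
            g.dispose();
        }
    }
}
```

Once only one tile needs annotating, the remaining problem is the write path: if your TIFF ImageWriter's write param reports tiling support (canWriteTiles()), it can pull tiles on demand from a lazily-computed RenderedImage instead of requiring the full raster in heap; whether your particular writer supports this is worth checking before relying on it.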

user8684061 wrote:
You are right, the SGA is too large for my server.
I guess Oracle set the SGA automatically when I chose the default installation, but why would the SGA be so big? Is Oracle not smart enough?

The default database configuration reserves 40% of physical memory for the SGA of an instance, which you as a user can always change. I don't see anything wrong with that, so I wouldn't call Oracle not smart.

If I don't decrease the SGA but increase max-shm-memory, would it work?

That needs support from the CPU architecture (32-bit or 64-bit) and from the kernel as well. Read up on huge pages.

Similar Messages

  • Out of memory when converting large files using Web service call

    I'm running into an out of memory error on the LiveCycle server when converting a 50 meg Word document with a Web service call. I've already tried increasing the heap size, but I'm at the limit for the 32-bit JVM on Windows. I could upgrade to a 64-bit JVM, but it would be a pain and I'm trying to avoid it. I've tried converting the 50 meg document using the LiveCycle admin and it works fine; the issue only occurs when using a web service call. I have a test client, and the memory spikes when it's generating the web service call, taking over a gig of memory. I assume it takes a similar amount of memory on the receiving end, which is why LiveCycle is running out of memory. Does anyone have any insight into why passing a 50 meg file requires so much memory? Is there any way around this?
    -Kelly

    Hi,
    You are correct that a complete 64-bit environment would solve this. The problem is that you get the out of memory error when the file is written to memory on the server. You can work around it by creating an interface that stores large files on the server's hard disk instead, which lets you convert files as large as LiveCycle can handle without any memory issue.
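The disk-backed interface suggested above amounts to spooling the incoming stream to a file in fixed-size chunks so the whole document never sits in the heap at once. A generic sketch (the class and method names here are made up for illustration, not LiveCycle API):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class StreamToDisk {

    // Copy an incoming stream to a temporary file in fixed-size chunks,
    // so heap usage stays constant regardless of document size.
    public static Path spoolToTempFile(InputStream in) throws IOException {
        Path tmp = Files.createTempFile("upload-", ".bin");
        try (OutputStream out = Files.newOutputStream(tmp)) {
            byte[] buf = new byte[64 * 1024];   // 64 KB buffer, bounded memory
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        }
        return tmp;
    }

    public static void main(String[] args) throws IOException {
        byte[] payload = new byte[5 * 1024 * 1024];   // stand-in for a large document
        Path tmp = spoolToTempFile(new ByteArrayInputStream(payload));
        System.out.println(Files.size(tmp) == payload.length);   // true
        Files.delete(tmp);
    }
}
```

The conversion service can then be pointed at the file path rather than an in-memory byte array, which is what keeps the 50 meg document from being duplicated across the call stack.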

  • Out of memory when i render HD with many layers

    I got an error message (out of memory) when I render full HD 1920 x 1080 in Final Cut Pro 5 with many layers (6 to 9 layers). I have 5.5 GB of memory, and when it crashes, 1.5 GB of memory is still free in my Activity Monitor. If I buy more RAM, will my problem be resolved? Please help me if anybody knows about this problem.
    thanks
    Yan

    I am having the same problem... (out of memory on renders)
    I've been watching my Activity Monitor and cancelling the render before I run out of memory... but I'm not getting much done and I can't do a big render at night.
    I've done all the usual things... no CMYK files, trash preferences, etc.
    Two questions:
    I am using Magic Bullet filters and they seem to stack up the inactive RAM. (When the inactive RAM fills up, that's when I get the "out of memory".) You said you saw a post about that a few days ago... Was there an answer besides changing out the filters? Or can you point me towards the posting?
    I'm also trying to figure out how to dump or clear the inactive RAM once I'm done with a render. I found that if I do a full restart I can clear my RAM and get a bunch of renders done before I fill up the inactive RAM again. This is going to be a long project if I have to do that.
    thanks
    Mark

  • SharePoint 2013 Search - Zip - Parser server ran out of memory - Processing this item failed because of a IFilter parser error

    Moving content databases from 2010 to 2013 August CU. Have 7 databases attached and ready to go, all the content is crawled successfully except zip files. Getting errors such as 
    Processing this item failed because of a IFilter parser error. ( Error parsing document 'http://sharepoint/file1.zip'. Error loading IFilter for extension '.zip' (Error code is 0x80CB4204). The function encountered an unknown error.; ; SearchID = 7A541F21-1CD3-4300-A95C-7E2A67B2563C
    Processing this item failed because the parser server ran out of memory. ( Error parsing document 'http://sharepoint/file2.zip'. Document failed to be processed. It probably crashed the server.; ; SearchID = 91B5D685-1C1A-4C43-9505-DA5414E40169 )
    SharePoint 2013 in a single out-of-the-box instance. Didn't install custom IFilters, as 2013 supports zip. No other extension has this issue. The zips range in size from 60-90 MB and contain mp3 files. I can download and unzip the files as needed.
    Should I care that the index isn't being populated with these items since they contain no metadata? I am thinking I should just omit these from the crawl. 

    This issue came back up for me as my results aren't displaying since this data is not part of the search index.
    Curious if anyone knows of a way to increase the parser server memory in SharePoint 2013 search?
    http://sharepoint/materials-ca/HPSActiveCDs/Votrevieprofessionnelleetvotrecarrireenregistrement.zip
    Processing this item failed because the parser server ran out of memory. ( Error parsing document 'http://sharepoint/materials-ca/HPSActiveCDs/Votrevieprofessionnelleetvotrecarrireenregistrement.zip'. Document failed to be processed. It probably crashed the
    server.; ; SearchID = 097AE4B0-9EB0-4AEC-AECE-AEFA631D4AA6 )
    http://sharepoint/materials-ca/HPSActiveCDs/Travaillerauseindunequipemultignrationnelle.zip
    Processing this item failed because of a IFilter parser error. ( Error parsing document 'http://sharepoint/materials-ca/HPSActiveCDs/Travaillerauseindunequipemultignrationnelle.zip'. Error loading IFilter for extension '.zip' (Error code is 0x80CB4204). The
    function encountered an unknown error.; ; SearchID = 4A0C99B1-CF44-4C8B-A6FF-E42309F97B72 )

  • Ora-04030 process out of memory when trying to allocate nn bytes..

    Hi,
    We are again facing the same error: after increasing pga_aggregate_target from 2G to 3G, the journal import program request still ends with ORA-04030 (process out of memory when trying to allocate nn bytes).
    Could this be a bug?
    DBMS:9.2.0.3
    OS :sun sparc 5.10

    If possible, apply the latest patchset to bring your DB up to the latest version, which should be more reliable and fix many bugs.
    Do you have enough free physical memory to use the 3 GB?
    Hope this helps.
    Regards.
    FR.

  • MacBook Pro goes out of sleep when unplugging the the power cord

    When my MacBook Pro 13'' is in sleep mode with the lid down and the power cord plugged in,
    unplugging the power cord brings the computer out of sleep mode.
    The same happens when it is in sleep mode without the power cord: it comes out of sleep
    when the cord is plugged back in, although the lid is still down.
    Has anyone had the same problem?

    I'm having the same issue! I took it to the Apple Store; they tested it all out, and none of the techs were able to fix it or had heard of the issue before.
    They told me to reinstall OS X, which I did, and same issue.

  • When opening files "Not enough memory to load TIFF file."

    Mac just had a new build of OS X 10.6.8.
    Total memory 4 GB, of which over 3 GB is free.
    Opening any image gives a memory error: "Not enough memory to load TIFF file"

    Open the linked TIFF file(s) in Photoshop and resave with LZW compression off (I believe). I know you can place an LZW TIFF and it usually works fine, but I think I had this in the past, turned LZW off / changed the RLE settings, and Illustrator was happy again. Also check whether the image is 8 bits/channel (Photoshop > Image > Mode > 8 Bits/Channel).

  • "General Error" and "Out of Memory" for only certain files?

    I have Final Cut Express 4 on Mac OS X Leopard and it has been working fine, up until now.
    For some reason, when I try to view two clips in the tab to the left of the screen to find sections of them to put into my project, I get a message that says "General Error", and when I click OK it says "Error: Out of Memory". They are both mp4 files, 242.9 MB and 294.2 MB, and I have viewed them both in QuickTime. I don't understand why it says it is out of memory when I still have 7 GB left on my computer and it still lets me view and add other files to my sequence.
    Can someone help me out and tell me how to fix this? I'd really appreciate advice!

    MPEG-4 is not a format that works in FCE. You'll have to convert it to one of FCE's formats. Without knowing the details of the original format, it's impossible to say what you should convert it to.

  • 3?'s: Message today warning lack of memory when using Word (files in Documents) something about "idisc not working" 2. Message week ago "Files not being backed up to Time Capsule"; 3. When using Mac Mail I'm prompted for password but none work TKS - J

    3 ?'s:
    1  Message today warning lack of memory when using Word (files in Documents) something about "idisc not working"
    2. Message week ago "Files not being backed up to Time Capsule";                                                                                                                                             
    3. When using Mac Mail I'm prompted for password but none work
    Thanks - J

    Thanks Allan for your quick response to my amateur questions.
    Allan: I'm running Mac OS X version 10.6.8. The processor is a 2.4 GHz Intel Core i5,
    with 4 GB of 1067 MHz DDR3 memory.
    I just "Updated Software" as prompted.
    Thanks for helping me!    - John Garrett
    PS.
    Hardware Overview:
      Model Name:          MacBook Pro
      Model Identifier:          MacBookPro6,2
      Processor Name:          Intel Core i5
      Processor Speed:          2.4 GHz
      Number Of Processors:          1
      Total Number Of Cores:          2
      L2 Cache (per core):          256 KB
      L3 Cache:          3 MB
      Memory:          4 GB
      Processor Interconnect Speed:          4.8 GT/s
      Boot ROM Version:          MBP61.0057.B0C
      SMC Version (system):          1.58f17
      Serial Number (system):          W8*****AGU
      Hardware UUID:          *****
      Sudden Motion Sensor:
      State:          Enabled
    <Edited By Host>

  • I am having problems loading sampler patches into the EXS24 sampler; when I go into the Edit menu and load multiple samples, all the samples are greyed out and I cannot select them. Any help with this would be much appreciated

    I am having problems loading sampler patches into the EXS24 sampler. When I go into the Edit menu and load multiple samples, all the samples are greyed out and I cannot select them. Any help with this would be much appreciated.

    It is very difficult to offer troubleshooting suggestions when the OS version you are using is unknown, as each OS has its own troubleshooting solutions.
    How large is your hard drive and how much hard drive space do you have left? 

  • iPod nano 6th generation settings issue - hi, when I am listening to music and the screen goes to sleep, my music stops playing. Please help with settings

    hi, when I am listening to music and the screen goes to sleep, my music stops playing. Please help with settings

    No problem.   And no worries, several people have made this same mistake!
    Either way enjoy the iPod!
    Brock

  • Multiclip Nightmare: General Error/Out of Memory when trying to match frame

    I'm an assistant editor on a reality show and am fairly experienced with Final Cut Pro. But I've never seen something like this.
    At our post house, we have over 30 Macs running FCP7, streaming media off an Xserve/Promise RAID SAN via Fibre Channel. We create multiclips for everything (it's a reality show) and have run into some weird bugs.
    What's happening is that multiclips will all of a sudden be unable to match back to the original multiclip. It gives a General Error, then an Out of Memory error. Sometimes, after those errors, FCP immediately quits. When I double-click the clip in the timeline, I get the same general error. I can, however, match back to the original media clip. But matching back to the multiclip causes the error.
    I can't pinpoint a cause or common element. It's happening to different episodes with different editors and different project files.
    Also, randomly, things will go offline for no reason. And not everything in a timeline, just some things (which makes no sense). We can reconnect them quickly since the media drive never went offline in the first place. I mention this because I'm wondering if both of these problems are common symptoms of corrupt multiclips, and if there's a way to fix them. The only element I can find linking media randomly going offline is that these multiclips had subclips embedded in them. But even that doesn't seem to be a common thread.
    These timelines have been passed around from story editors to editors to AEs then back to editors, so we're thinking these multiclips might be getting corrupted through that process. I don't understand why because everything's looking at the server and all the Macs are pretty much identical.
    What I began to do was recut each clip in the editor's timeline with the original multiclip. It fixes the problem, but it's beyond time-consuming to say the least. Now that these errors have come up three times in different places, I'm worried we have a widespread problem, and I wonder whether there's a better way to fix it.
    Details:
    We transcoded all the DVCPRO tapes with AJA Kona Cards to NTSC ProRes (LT) with plans to online the show once we lock
    Everyone's running the latest FCP7 and Snow Leopard release

    Nate Orloff wrote:
    It seems that FCP is making HUGE multiclips. I'm guessing that's the source of the memory error. Some multiclips (the ones it'll let me open) are 9 hours long.
    Interesting... That could indeed be the source of your problem.
    And for no reason.
    Are you sure there's no reason? When making multiclips, isn't there one clip whose timecode passes 00:00:00:00, for example? Or a small clip that is also selected with a completely different timecode? Is it black for all 9 hours? etc.
    There must be something that is causing these long multiclips.
    I agree with Tom (down in this thread) that sending Apple feedback with as much detail as possible might be helpful.
    [FCP feedback|http://www.apple.com/feedback/finalcutpro.html]
    I would also be interested in an XML of one of the culprit clips. Not that I know the answers, but looking at it might give some clue.
    Please select one of the wrong multiclips (in the browser), then select File > Export > XML (version 5, include metadata; please also make a version 4 XML).
    Send both XMLs to my email address (in my profile; click on my name in the left column of this post). No promises, but you never know...
    Rienk

  • How can I avoid running out of memory when creating components dynamically

    Hello everyone,
    Recently, I have been planning to design a web application. It will be used by all middle school teachers in a region to make examination papers, and it must contain the following main functions.
    1) Generate test questions dynamically. For instance, a teacher who logs on to the web application will only see a select-one menu and a Next Quiz button. The former is used for determining the number of options for the current multiple/single choice question. The latter is dedicated to creating the appropriate input text elements according to the selected option number. That is to say, if the teacher selects 4 in the menu and presses the Next Quiz button, 5 input text form elements will appear. The first one is for the question to be asked, such as "1. What is the biggest planet in the solar system?"; the others are optional answers like a) Uranus, b) Saturn, c) Jupiter, d) Earth. Each answer stands for an input text element. When the teacher fills in the fourth answer, another select-one menu and Next Quiz button will emerge on the fly just under this answer, allowing the teacher to make the second question. The same thing repeats for the following questions.
    2) Undo and Redo. Whenever a teacher wants to roll back or redo what he has done, he just presses the Undo or Redo button. In the previous example, if the teacher selects the third answer and presses the Delete button to drop it, it will delete both the literal string content and the input text element, changing answer d to c automatically. If, after that, he decides to get back the original answer c, Jupiter, he can just click the Undo button as if he hadn't made the deletion.
    3) Save unfinished work on the client side. If a teacher has done half of his work, he can press the Save button to store what he has done on his own computer. The reason for doing so is simply to alleviate the burden on the server. Although all finished test papers must be saved in a database on the server, sometimes the unfinished papers are dropped forever, or only become the final test papers after several months. So if these papers were kept on the server, they would waste the server's storage. Next time, the teacher can press the Restore button on the page to get the previously stored part of the test paper from his own computer and continue to finish the whole paper.
    4) Allow at least 1,000 teachers to make test papers at the same time. The maximum number of questions per examination paper is 60.
    Here are my two rough solutions:
    A. Using JSF.
    B. Using JavaScript and plain JSP without JSF.
    A comparison of the two solutions:
    1) Both schemes can implement the first and second requirements. In the JSF page I could add a standard panelGrid tag and use its binding attribute. In the backing bean, the method specified by the binding attribute is responsible for generating HtmlInput objects and adding them to the HtmlPanelGrid object on the fly. Every HtmlInput object corresponds to a question subject or an optional answer. The method is called by an actionListener, which is registered on the Next Quiz commandButton and triggered by clicking this button on the client side. Using JSF also makes it easy to manage the HtmlInput objects; e.g. panelGrid.getChildren().add(htmlInput) and panelGrid.getChildren().remove(htmlInput) correspond to undoing the deletion of an optional answer and redoing the deletion, respectively. I know JavaScript can also achieve these goals, but it could be more complex since I don't know JavaScript well.
    2) I cannot find a way to meet the third requirement right now. I am eager to hear your suggestions.
    3) Using JSF, I think, can't allow 1,000 teachers to work on their own papers at the same time, because in this scenario, supposing each questionnaire has 60 questions and 4 answers per question, there will be approximately 300,000 HtmlInput objects (1,000 x 60 x (4+1)) created on the server side. The server would undoubtedly run out of memory. To make things better, we could use a custom component that is rendered as a whole question including all its optional answers. That is to say, one custom component on the server side would stand for a whole question on the client side. Even so, about 60,000 (1,000 x 60) such custom components would be created progressively and dynamically, plus other UISelectOne and UICommand objects, which most servers still cannot afford. Do I have to use JavaScript to avoid occupying the server's memory this way? If so, I have to go back and use JavaScript and plain JSP without JSF.
    Thank you in advance!
    Best Regards, Ailsa
    2007/5/4

    Thank you for your quick response, BalusC. I really appreciate your answer.
    Yes, you are right. If I manually coded the same number of components in the JSF pages instead of generating them dynamically, the server would still run out of memory. That is to say, JSF pages might not accommodate a great deal of concurrent visitors. If I upgraded the server just to allow 1,000 teachers to make their own test papers at the same time, then when over 2,000 students take the same questionnaire simultaneously, the server would need another upgrade. So I have to do what you have told me: use JS+DOM instead of upgrading the server endlessly.
    Best Regards, Ailsa
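The arithmetic behind the 300,000 figure, plus a rough heap estimate, can be written down directly. The per-component byte cost below is a hypothetical figure chosen for illustration, not a measured JSF number:

```java
public class ComponentEstimate {
    public static void main(String[] args) {
        int teachers = 1_000;
        int questions = 60;
        int inputsPerQuestion = 4 + 1;   // 4 answer fields + 1 question text field

        long components = (long) teachers * questions * inputsPerQuestion;
        System.out.println(components);   // 300000

        // Assume ~2 KB of server-side state per component tree node
        // (a rough, hypothetical figure for illustration only).
        long bytes = components * 2 * 1024;
        System.out.printf("~%d MB of heap%n", bytes / (1024 * 1024));   // ~585 MB
    }
}
```

Even with a generous real-world cost the total lands in the hundreds of megabytes for component state alone, which is why keeping the form state in the browser (JS+DOM) scales so much better here.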

  • "Error: Out of Memory" when trying to Render

    Hello all,
    I'm editing a project in FCP 5 that is using, in part, .mpeg-4 videos downloaded from the internet. After several weeks on the project with things going fine, the downloaded videos can no longer render. I can see them in the source window, but when laid into the sequence these downloaded videos produce an "Error: Out of Memory" message. Other videos of this type that are already rendered and playing in the existing sequence cannot be re-rendered or adjusted without producing the same Error message.
    A few things to remember...
    1. Other types of rendering, such as filters and fit to fill effects, render just fine.
    2. There is over 30 GBs on the external drive this project is running on.
    3. I've tried resetting scratch disks, transferring the project to another drive, and opening it on a new, super-fast computer, and nothing is different.
    Please help, I'm really stuck in the mud here and am dying to get past this issue,
    Peace,
    Sam

    Did either of you find an answer? I'm running into the SAME issues. I feel strongly that it has to do with the QuickTime update, as editing mp4s worked FINE in January.
    There is a lengthy protocol for trying to circumvent this issue:
    http://discussions.apple.com/thread.jspa?threadID=1413028&tstart=0
    It didn't work for me, but I think it worked for the original poster. And VLC (http://www.videolan.org/) ended up being the only program I can even VIEW mp4s in properly now.
    So if you can use a viewer, that works.

  • "File Error" and "Out of Memory" when rendering FCE

    Dear all,
    I imported a 2-hour holiday video into FCE.
    When rendering the video, a message appears saying that the rendering process will take about 15 minutes to complete.
    However, I am frequently warned by the following message:
    "File Error: The specified file is open and in use by this or another application"
    When I activate rendering again, the render time has increased to about 45 minutes.
    I now either receive the message "File Error: The specified file is open and in use by this or another application" a couple of times, or even worse: "Out of memory".
    Today I purchased an additional 2 GB of memory to increase my memory from 750 MB to 2.5 GB!!!
    Can anyone please tell me what could be the cause of these messages and how to solve them?
    BTW, no other programs are running while I use FCE.
    Thanks in advance,
    Hans E.<br>
    PowerMac G5-Dual 1.8GHz, Memory 2.5GB, iMac G3-600MHz, Airport, Airport Express Mac OS X (10.3.9)

    Is it happening when you're rendering, or exporting?
    The error message means FCE is trying to rewrite a file that it is currently using. It could be mistakenly trying to save a render file, if you're only rendering, or if you're trying to re-export a file, you'll get that message.
    Try dumping all your render files, restarting FCE and trying again.
    The Out of Memory error almost always points toward a corrupt file. Could be a source file, could be a render file. Again, dump your render files and try again.
    If these don't work, you need to close FCE, take all your source media offline, then open FCE. Now, reconnect each clip one by one until you find the corrupt clip. Keep going, there may be more than one. Re-capture the corrupt clips and you should be good to go.
