Handling large image files...

Hello All,
A few days back I posted a query about handling satellite imagery but didn't get any responses...
Well, what I'll try to do now is post the same question in a different way.
In my current project, we need around 15 to 20 textures of about 1024*1024 pixels each to be simultaneously displayed in the scene graph. We are using a good hardware accelerator card with sufficient memory, but the overall performance degrades a lot.
Any suggestions for handling these images (or some sort of dynamic loading)?
Regards

Hello,
You should use mipmaps. See Chapter 7.6 of the Java 3D tutorial:
http://java.sun.com/products/java-media/3D/collateral/
Regards,
Oliver
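To make the cost of mipmapping concrete, here is a rough sketch in plain Java (this is arithmetic only, not Java 3D code; the class and method names are mine, for illustration) of how many levels and texels a full mipmap chain adds for a 1024x1024 texture:

```java
public class MipmapMath {
    // Number of mipmap levels for a square power-of-two texture:
    // 1024 -> 1024, 512, 256, ..., 1  = 11 levels.
    static int levelCount(int size) {
        int levels = 1;
        while (size > 1) {
            size /= 2;
            levels++;
        }
        return levels;
    }

    // Total texels in the full chain; converges to ~4/3 of the base level.
    static long chainTexels(int size) {
        long total = 0;
        while (size >= 1) {
            total += (long) size * size;
            if (size == 1) break;
            size /= 2;
        }
        return total;
    }

    public static void main(String[] args) {
        int base = 1024;
        System.out.println("levels: " + levelCount(base));  // 11
        System.out.println("texels: " + chainTexels(base)); // 1398101
        // 20 such textures at 4 bytes per texel (RGBA), with full mip chains:
        long bytes = 20L * chainTexels(base) * 4;
        System.out.println("approx MB for 20 textures: " + bytes / (1024 * 1024));
    }
}
```

The full chain costs only about a third more memory than the base level, and it lets the hardware sample a level matched to the on-screen size instead of minifying the full 1024x1024 texture every frame — which is usually where the performance goes.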

Similar Messages

  • How to increase performance speed of Photoshop CS6 v13.0.6 with transformations in LARGE image files (25,000 X 50,000 pixels) on an iMac 3.4 GHz Intel Core i7, 16 GB memory, Mac OS 10.7.5? Should I purchase a Mac Pro?

    I have hundreds of these large image files to process.  I frequently use the SKEW, WARP, and IMAGE ROTATION features in Photoshop.  I process the files ONE AT A TIME and have only Photoshop running on my iMac.  Most of the time I am watching SLOW progress bars as the transformations complete.
    I have allocated as much memory as I have (about 14GB) exclusively to Photoshop using the performance preferences panel.  Does anyone have a trick for speeding up the processing of these files, or is my only solution to buy a faster Mac, like the new Mac Pro?


  • Flex file upload issue with large image files

    Hello, I have created a sample Flex application to upload an image, and also created a Java servlet to upload and save the image, deployed on a local Tomcat server. I am testing the application on a LAN. I am able to upload small as well as large image files (1MB) from some PCs, but on some other PCs I am getting an IOError while uploading large image files, although it works fine for small images. The upload hangs at 10%-20% and throws an IOError. *Surprisingly, it works OK on XP systems and causes issues on Windows 7 systems.*
    Please give me any idea for a solution.
    In Tomcat server side it is giving following error:
    request: org.apache.catalina.connector.RequestFacade@c19694
    org.apache.commons.fileupload.FileUploadBase$IOFileUploadException: Processing of multipart/form-data request failed. Stream ended unexpectedly
            at org.apache.commons.fileupload.FileUploadBase.parseRequest(FileUploadBase.java:371)
            at org.apache.commons.fileupload.servlet.ServletFileUpload.parseRequest(ServletFileUpload.java:126)
            at flex.servlets.UploadImage.doPost(UploadImage.java:47)
            at javax.servlet.http.HttpServlet.service(HttpServlet.java:637)
            at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
            at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
            at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
            at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
            at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
            at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
            at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
            at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
            at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:293)
            at org.apache.coyote.http11.Http11AprProcessor.process(Http11AprProcessor.java:877)
            at org.apache.coyote.http11.Http11AprProtocol$Http11ConnectionHandler.process(Http11AprProtocol.java:594)
            at org.apache.tomcat.util.net.AprEndpoint$Worker.run(AprEndpoint.java:1675)
            at java.lang.Thread.run(Thread.java:722)
    Caused by: org.apache.commons.fileupload.MultipartStream$MalformedStreamException: Stream ended unexpectedly
            at org.apache.commons.fileupload.MultipartStream$ItemInputStream.makeAvailable(MultipartStream.java:982)
            at org.apache.commons.fileupload.MultipartStream$ItemInputStream.read(MultipartStream.java:886)
            at java.io.InputStream.read(InputStream.java:101)
            at org.apache.commons.fileupload.util.Streams.copy(Streams.java:96)
            at org.apache.commons.fileupload.util.Streams.copy(Streams.java:66)
            at org.apache.commons.fileupload.FileUploadBase.parseRequest(FileUploadBase.java:366)
    UploadImage.java:
    package flex.servlets;

    import java.io.*;
    import java.util.*;
    import org.apache.commons.fileupload.*;
    import org.apache.commons.fileupload.disk.DiskFileItemFactory;
    import org.apache.commons.fileupload.servlet.ServletFileUpload;
    import javax.servlet.*;
    import javax.servlet.http.*;

    public class UploadImage extends HttpServlet {

        /**
         * @see HttpServlet#doGet(HttpServletRequest request, HttpServletResponse
         *      response)
         */
        protected void doGet(HttpServletRequest request,
                HttpServletResponse response) throws ServletException, IOException {
            doPost(request, response);
        }

        public void doPost(HttpServletRequest request,
                HttpServletResponse response) throws ServletException, IOException {
            PrintWriter out = response.getWriter();
            boolean isMultipart = ServletFileUpload.isMultipartContent(request);
            System.out.println("request: " + request);
            if (!isMultipart) {
                System.out.println("File Not Uploaded");
                return;
            }
            FileItemFactory factory = new DiskFileItemFactory();
            ServletFileUpload upload = new ServletFileUpload(factory);
            List items;
            try {
                items = upload.parseRequest(request);
                System.out.println("items: " + items);
            } catch (FileUploadException e) {
                e.printStackTrace();
                return;
            }
            Iterator itr = items.iterator();
            while (itr.hasNext()) {
                FileItem item = (FileItem) itr.next();
                if (item.isFormField()) {
                    System.out.println("name: " + item.getFieldName());
                    System.out.println("value: " + item.getString());
                } else {
                    try {
                        String itemName = item.getName();
                        // Random suffix so repeated uploads don't overwrite each other
                        int r = Math.abs(new Random().nextInt());
                        // Split the original name into base name and extension
                        int dot = itemName.lastIndexOf('.');
                        String baseName = (dot >= 0) ? itemName.substring(0, dot) : itemName;
                        String extension = (dot >= 0) ? itemName.substring(dot) : "";
                        String finalImage = baseName + "_" + r + extension;
                        System.out.println("Final Image === " + finalImage);
                        // Save under the computed name instead of a hard-coded one
                        File savedFile = new File(
                                getServletContext().getRealPath("assets/images/"), finalImage);
                        item.write(savedFile);
                        out.println("<html>");
                        out.println("<body>");
                        out.println("image inserted successfully");
                        out.println("</body>");
                        out.println("</html>");
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            }
        }
    }

    It only occurs on Windows 7 systems, and the root of this problem is the SSL certificate.
    Workaround for this:
    Open the application in IE and click the certificate error link in the address bar. Click 'Install certificate' and you are done.
    Happy programming.
    Thanks
    DevSachin

  • LabVIEW crashes when creating large image files

    I have a problem with LabVIEW 6.0.2 (I've tested evaluation version 7.0 too).
    I'm constructing a very large image, for example 4500x4500 pixels. LabVIEW crashes when converting the picture to a pixmap. The image is fully constructed on my screen (in a picture control), but when converting it to a pixmap (for saving the image in a known format (bmp, jpg, tiff)), LabVIEW crashes.
    I did some testing, and when the number of pixels exceeds the limit of 2^24 (16777216), the file 'image.cpp' crashes on line 1570. The VI to convert it to a pixmap is 'Picture to Pixmap.vi'.
    Does someone know a workaround for this problem? Or is there a fix for it?
    Thank you!

    I've tested the 6i version of my VI in the LabVIEW 7.0 evaluation version. It raised an error, but not the same error:
    d:\lvworm\src\lvsource\compatexport.cpp(37) : DAbort: Called a routine not in the compatibility LVRT table
    $Id: //labview/branches/Wormhole/dev/lvsource/compatexport.cpp#11 $
    0x004BD4CB - LabVIEW_Eval + 0
    0x0EB710D9 - lvs248 + 0
    0x094C87A0 - + 0
    So I replaced the picture VIs with the 7.0 evaluation version VIs, and it worked. It is now possible for me to construct very large image files!
    I see no attached VI to test, but I guess it is also solved in LabVIEW 7.0.
    I used this file to convert the picture to image data:
    C:\Program Files\National Instruments\LabVIEW 7.0 Evaluation\vi.lib
    \picture\pictutil.llb\Picture to Pixmap.vi
    And this file to convert image data to bmp:
    C:\Program Files\National Instruments\LabVIEW 7.0 Evaluation\vi.lib\picture\bmp.llb\Write BMP File.vi
    I guess I have to write a workaround for this problem:
    divide the picture into blocks of 4096 x 4096 and then merge the image data arrays of the blocks together.
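For what it's worth, the block-splitting workaround above amounts to computing a tile grid. A minimal sketch of that tile arithmetic in Java (the class and method names are mine, not from any NI library):

```java
import java.util.ArrayList;
import java.util.List;

public class TileGrid {
    // Split a width x height image into tiles of at most tileSize pixels
    // on each side; returns the tile rectangles as {x, y, w, h}.
    static List<int[]> tiles(int width, int height, int tileSize) {
        List<int[]> out = new ArrayList<>();
        for (int y = 0; y < height; y += tileSize) {
            for (int x = 0; x < width; x += tileSize) {
                int w = Math.min(tileSize, width - x);
                int h = Math.min(tileSize, height - y);
                out.add(new int[] {x, y, w, h});
            }
        }
        return out;
    }

    public static void main(String[] args) {
        // A 4500 x 4500 picture split into 4096-pixel blocks -> 2 x 2 tiles,
        // each safely below the 2^24-pixel limit mentioned above.
        for (int[] t : tiles(4500, 4500, 4096)) {
            System.out.println(t[0] + "," + t[1] + " " + t[2] + "x" + t[3]);
        }
    }
}
```

Each tile stays under the pixel limit on its own, so the conversion can be run per tile and the resulting arrays stitched back together in the output file.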

  • Unable to create large image file in iPhoto

    I make a lot of panoramic images by stitching together overlapping images with hugin in TIFF format, then edit them in iPhoto and use it to export them in JPEG format for greater versatility of use. So far I haven't had any trouble, even with large images as big as ~25,000x2000 pixels and ~90MB file size.
    However, I have recently come across a problem trying to export an edited version of a particularly large one: 38,062x1799 pixels, 136MB file size. I can export an unedited version to JPEG without trouble, but when I tried to export an edited version it gave me an error saying it couldn't create the file. I looked around here for solutions and found the suggestion to check the file size in Finder; it was zero. After experimenting I've found that the file size goes to zero as soon as any changes are made to the file. The edited image can still be viewed in iPhoto, but does not display if you try to zoom in. I have tried making the TIFF file again with hugin to see if it is a problem with the file, but I experience the same problem with the newly made file.
    I am working on a MacBook Air (1.7 GHz Intel Core i5, 4 GB 1333 MHz DDR3 memory, Intel HD Graphics 3000 384 MB, OS X 10.7.5) in iPhoto '11 version 9.2.2 (629.40). Any suggestions for getting round this issue?

    I thought of something that might help with working out what's going on.
    After I make any edits to the image, if I move on to the Info tab the image initially goes a bit blurry, as it often does with large images, but instead of coming up as a clear image after a few seconds like normal, I get the image I've attached here and the file size changes to zero. To be able to see the thumbnail of the edited image I need to come back out to the library again, but the file size is still zero. If I revert to the original, the file size is restored.

  • InDesign CS 5.5: Can I link to large image files?

    Dear friends,
    We are creating a 111-page file that links to about 120 large picture files.  Each picture file is about 40MB.  Doing this seems like it will slow down my computer's ability to quickly edit the file, because it will be so large.
    QUESTION: What are some ways around this?
    What is recommended?
    Shall I place the files, and then somehow TURN DOWN the "display performance"?
    Or should I make SMALLER versions of all 120 files, then replace them with the large files when I am about to print to PDF? (That sounds like a LOT of work.)

    I'm not sure you are using the term "embedding" properly here.
    You want to link the images by using File > Place... That includes a pointer to the entire file and a preview in the ID file, but not the actual image data. In order to embed (which saves a complete copy of the image data inside the ID file and will make it HUGE), you must select the link in the Links panel and choose Embed from the panel flyout menu.
    It's also possible to copy and paste vector drawings into ID, and they will not show in the Links panel at all. I recommend against this in favor of Placing them as well. Complex pasted vector objects can cause significant slowdown whenever the screen has to be redrawn.

  • About to buy Panasonic HDC-SD9: Advice on handling large AVCHD files

    I'm getting married soon and want to buy a hi-def camcorder to use during the wedding ceremony and honeymoon. I like the Panasonic HDC-SD9 but have some basic questions. Please excuse my ignorance!
    a) I understand that the HDC-SD9 records in AVCHD, which on conversion to AIC in iMovie '08 greatly increases the file size. The hard drive on my MacBook is only 120GB, and a converted one-hour AVCHD video is likely to make a serious dent in this. What is the best way to handle this?
    b) Should I use Disk Utility to make a copy of the SD card and delete the files from the SD card? How do I resurrect this when I need to use the raw footage? Can I play directly from the disk copy?
    c) Should I convert the large AIC file into another format to save it on my hard drive? What is the best format to reduce file size? How do I do this?
    d) Ideally, we want to give our wedding guests a copy of the wedding footage on a standard-def DVD. Can we use iDVD to make a DVD from the AIC file? How much footage can be stored on a normal DVD?
    e) What programs can be used to play the AVCHD file? Can I play it back in QuickTime 7.5.5?
    Many thanks

    If you go the AVCHD route, my suggestion for most of these questions is to buy an external hard drive (formatted as Mac OS Extended), load your footage onto that, and keep the drive. Then guard it with your life.
    IMO, tape is still an option that has advantages. The tape is a backup, and they are inexpensive.
    Most NLEs work with the formats, and the file sizes are smallish. AVCHD can bloat to nearly 50 gig an hour; DV, no more than 13.5 gig.
    Al

  • iTunes 8.2 has problems handling larger music files

    Hi,
    After upgrading to iTunes 8.2, I have begun to notice that the program stalls with larger mp3 files. I have quite a lot of music files of 200MB and above, and before opening them iTunes freezes for about 10-15 seconds (with the "beach ball" on the screen); then playback starts normally. This is not a serious problem, but it is pretty annoying. iTunes 8.2 behaves the same way both on my black MacBook and on my old G4 iMac running Tiger. I never noticed such behaviour with the earlier 8.1 version of iTunes.
    Has anyone encountered a similar problem, and could there be any remedy? Thanks in advance.
    Cheers,
    Matt

    iso_omena,
    I can't duplicate your problem here, so maybe there's something about your machine or the software installed on it. Do you have any iTunes plugins installed?
    You can also take a sample of iTunes while it's hung and send it to our good buddy, Roy, who will forward it to the right Apple engineers. To sample iTunes:
    1. Open the terminal app in /Applications/Utilities
    2. Get iTunes into the state where you have the spinning cursor and the music's not playing.
    3. At the command prompt in the terminal window, type in "sample iTunes 10". This will sample iTunes for 10 seconds and then tell you where it put the output file.
    4. You need to send that file to <[email protected]>. Make sure that you include:
    (1) the sample file.
    (2) the link to this thread.
    (3) a one line description of your problem, i.e. "iTunes crashes on launch".
    (4) the username that you're using here in the discussion boards.
    Roy is getting a little swamped with messages and needs to make sure he gets all the information. And if he doesn't get that information, the message is just going to get dropped on the floor.

  • Using IMAQ Image Display control vs IMAQ WindDraw for large image files

    Hello All;
    I am designing an application that currently uses the IMAQ Image Display control to view large images (5K x 3K and larger).  My problem is that these images take 10-20 seconds to load and display, whereas if I use IMAQ WindDraw to display my image in a separate window, it only takes a couple of seconds.  My application makes use of the subpanels in LabVIEW 8.0, and to make it pleasant for the user, the line profile, histogram, and image viewer displays are contained within the same GUI (panel).
    I read the National Instruments application note on displaying large images, and it did not seem to make a difference.  For example, I switched the 'modern' IMAQ Image Display control for the classic Image Display control, since the 'classic' does not contain any of the 3D rendering tools which might slow the process down.
    Why is there such a huge difference in loading times if I am trying to do exactly the same thing with both methods?  How can I bring the IMAQ Image Display control up to the same speed as the IMAQ WindDraw tool?
    I am currently using LabVIEW 8.0 with the latest IMAQ/NI Vision package from NI (IMAQ v7.1?).  Thanks.
    DJH

    Use a property node and select 16 bit image mapping. You can create a control for this or whatever you need. If you select the individual elements, you can get enumerated controls.
    Bruce
    Bruce Ammons
    Ammons Engineering

  • How to process Large Image Files (JP2 220MB+)?

    All,
    I'm relatively new to Java Advanced Imaging, so I need a little help. I've been working on a thesis that involves converting digital terrain data into X3D scenes for future use in military training and applications. Part of this work involves processing large imagery data to texture the previously mentioned terrain data. I have an image slicer that can handle rather large files (200MB+ JPEG files), but it can't seem to process JPEG 2000 data. Below is an excerpt from my code.
    public void testSlicer() {
        String fname = "file.jp2";
        Iterator readers = ImageIO.getImageReadersByFormatName("jpeg2000");
        ImageReader imageReader = (ImageReader) readers.next();
        try {
            ImageInputStream imageInputStream =
                    ImageIO.createImageInputStream(new File(fname));
            imageReader.setInput(imageInputStream, true);
        } catch (IOException ex) {
            System.out.println("Error: " + ex);
            return;
        }
        ImageReadParam imageReadParam = imageReader.getDefaultReadParam();
        // Only reading a portion of the file
        Rectangle rect = new Rectangle(0, 0, 1000, 1000);
        imageReadParam.setSourceRegion(rect);
        // Subsampling every 4th pixel
        imageReadParam.setSourceSubsampling(4, 4, 0, 0);
        BufferedImage destBImage = null;
        try {
            destBImage = imageReader.read(0, imageReadParam);
        } catch (IOException ex) {
            System.out.println("IO Exception: " + ex);
        }
    }
    The images I am trying to read are in excess of 30000 pixels by 30000 pixels (15m resolution at 5 degrees latitude and 6 degrees longitude). I continually get an OutOfMemoryError, even though I am pumping the heap size up to 16000MB on the command line.
    Any help would be greatly appreciated.
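One way to stay under the heap limit is to never decode the whole image at once: loop over fixed-size source regions and process each tile separately. Here is a self-contained sketch of the same setSourceRegion/setSourceSubsampling approach, using the built-in PNG reader as a stand-in, since a JPEG 2000 reader may not be installed everywhere (the class and method names are mine, for illustration):

```java
import javax.imageio.ImageIO;
import javax.imageio.ImageReadParam;
import javax.imageio.ImageReader;
import javax.imageio.stream.ImageInputStream;
import java.awt.Rectangle;
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;

public class RegionReadDemo {
    // Decode only a window of the image, subsampled, instead of the
    // whole thing; this is what keeps peak heap use small.
    static BufferedImage readTile() throws IOException {
        // A small in-memory PNG stands in for the huge JP2 here.
        BufferedImage src = new BufferedImage(1000, 1000, BufferedImage.TYPE_INT_RGB);
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        ImageIO.write(src, "png", baos);

        ImageReader reader = ImageIO.getImageReadersByFormatName("png").next();
        ImageInputStream iis = ImageIO.createImageInputStream(
                new ByteArrayInputStream(baos.toByteArray()));
        reader.setInput(iis, true);

        ImageReadParam param = reader.getDefaultReadParam();
        param.setSourceRegion(new Rectangle(100, 100, 400, 400)); // window only
        param.setSourceSubsampling(4, 4, 0, 0); // keep every 4th pixel

        BufferedImage tile = reader.read(0, param);
        reader.dispose();
        return tile;
    }

    public static void main(String[] args) throws IOException {
        BufferedImage tile = readTile();
        System.out.println(tile.getWidth() + "x" + tile.getHeight()); // 100x100
    }
}
```

Applied to a 30000x30000 JP2, you would iterate this over a grid of source regions rather than calling read(0) on the full image. Whether the JAI JPEG 2000 reader honors the region without still buffering full codec tiles internally depends on that codec, so peak memory may not drop as far as the region size suggests.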

    Hi,
    In general, an extra sizing for XI memory consumption is not required. The total memory of the SAP Web Application Server should be sufficient, except in the case of large messages (>1MB).
    To determine the memory consumption for processing large messages, you can use the following rules of thumb:
    Allocate 3 MB per process (for example, the number of parallel messages per second may be an indicator).
    Allocate 4 kB per 1 kB of message size in the asynchronous case, or 9 kB per 1 kB of message size in the synchronous case.
    Example: asynchronous concurrent processing of 10 messages with a size of 1 MB each requires 70 MB of memory: (3 MB + 4 * 1 MB) * 10 = 70 MB.
    With mapping or content-based routing, where an internal representation of the message payload may be necessary, the memory requirements can be much higher (possibly exceeding 20 kB per 1 kB of message, depending on the type of mapping).
    The size of the largest message thus depends mainly on the size of the available main memory. On a normal 32-bit operating system, there is an upper boundary of approximately 1.5 to 2 GB per process, limiting the respective largest message size.
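The rule of thumb above is just arithmetic; a small sketch (my own helper, not an SAP API) that reproduces the worked example:

```java
public class XiMemoryRule {
    // Rule of thumb from the note above: per concurrent message,
    // 3 MB base plus 4 MB per MB of message (async) or 9 MB per MB (sync).
    static double requiredMb(int messages, double messageMb, boolean synchronous) {
        double perMb = synchronous ? 9 : 4;
        return (3.0 + perMb * messageMb) * messages;
    }

    public static void main(String[] args) {
        // 10 async messages of 1 MB each: (3 + 4 * 1) * 10 = 70 MB
        System.out.println(requiredMb(10, 1.0, false));
    }
}
```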
    Please check these links:
    /community [original link is broken]:///people/michal.krawczyk2/blog/2006/06/08/xi-timeouts-timeouts-timeouts
    Input Flat File Size Determination
    /people/shabarish.vijayakumar/blog/2006/04/03/xi-in-the-role-of-a-ftp
    data packet size  - load from flat file
    How to upload a file of very huge size on to server.
    Please let me know whether your problem is solved or not.
    Regards
    Chilla..

  • How to handle large images?

    Hi,
    Does anyone know how to handle big JPG images (1280*960) so that they can be presented in a MIDlet?
    The problem is that the images require so much memory that they can't be decoded into an Image object with the Image.createImage method. One solution would be to extract the thumbnail image from the EXIF headers. Unfortunately, at least images taken with a Nokia 6680 don't contain a thumbnail in the EXIF headers.
    So the only solution seems to be to decode the byte representation of the image and resize it before creating an Image object.
    Does anybody know of any library for this, or tips on where to start?
    Br, Ilpo

    Hi,
    I think it is not possible. My application contains a file browser (which uses JSR-75). The user can use the browser to select an image either from phone memory or the memory card. After the selection I would like to present the selected image so that the user can be sure it is the right image. The selected image will then be sent to the server side with some additional data for further processing (but that is another story).
    Now the problem is that, for example, with a Nokia 6680 the user can take images as big as 1280*960 and I can't present them anymore because of the memory restrictions. With a 640*480 image there is no problem, because I can create an image object and then use a simple algorithm to resize the image for presentation.
    Br, Ilpo
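If the pixels can be decoded (or decoded piecewise), the "simple algorithm to resize the image for presentation" mentioned above can be as small as a nearest-neighbour loop. A plain-Java sketch of the idea — MIDP has no BufferedImage, so this operates on a raw int[] of pixels, and the class and method names are mine:

```java
public class NearestNeighborResize {
    // rgb: source pixels in row-major order, srcW x srcH;
    // returns a dstW x dstH array picking the nearest source pixel.
    static int[] resize(int[] rgb, int srcW, int srcH, int dstW, int dstH) {
        int[] out = new int[dstW * dstH];
        for (int y = 0; y < dstH; y++) {
            int sy = y * srcH / dstH;     // nearest source row
            for (int x = 0; x < dstW; x++) {
                int sx = x * srcW / dstW; // nearest source column
                out[y * dstW + x] = rgb[sy * srcW + sx];
            }
        }
        return out;
    }

    public static void main(String[] args) {
        int[] big = {0xFF0000, 0x00FF00, 0x0000FF, 0xFFFFFF}; // a 2x2 image
        int[] small = resize(big, 2, 2, 1, 1);
        System.out.println(Integer.toHexString(small[0]));
    }
}
```

On a MIDP 2.0 device the int[] would come from Image.getRGB and go back through Image.createRGBImage; the real obstacle Ilpo describes is that decoding the full 1280*960 image to get those pixels in the first place may already exceed the heap.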

  • Best Mac Pro specification to handle large Photoshop files

    I'm about to buy a Mac Pro and want some advice on where I should be spending my money to get the best performance.
    I use Photoshop to create large pieces of artwork (over 2 m square) that contain approx. 100 layers and require 300 dpi resolution for printing. File sizes are around 8GB before flattening.
    I have tried using the standard demo Mac Pros available to handle the files, but with little success. I assume I need to bump up the RAM (from Crucial), but how much should I consider? Is it worth getting a faster processor, a separate hard drive for the scratch disk, a better graphics card, etc.?
    Obviously all of them, but with limited funds, which are more important?
    Thanks in anticipation!
    Jacqui

    Head over to http://www.macgurus.com and look down the left side toward bottom for the Guide to Photoshop Acceleration.
    http://homepage.mac.com/boots911/.Public/PhotoshopAccelerationBasics2.4W.pdf
    I would boot from a striped array, a pair of 10K Raptor 300GB models.
    http://barefeats.com/hard103.html
    Then upgrade to 16GB or 32GB of memory.
    http://www.barefeats.com/harper3.html
    External RAID for scratch that could be 2, 4 or more drives, and another RAID for saving your work.
    And with RAID, you want a couple backups for non-temp files of course.
    http://www.barefeats.com/hard101.html - WD VelociRaptor
    http://www.barefeats.com/harper13.html - 4 drive RAID: SAS vs SATA
    http://www.barefeats.com/hard94.html - 1TB drives
    http://www.barefeats.com/harper14.html - WD 640GB Caviar
    http://www.barefeats.com/harper9.html - Boot drives

  • Firefox does not handle large bookmark "files" gracefully; it HANGS

    Essentially, a massive number of bookmarks causes Firefox to go off the deep end and never shut down in any reasonable time. I tried backups in JSON and HTML; the files took ~2-3 days to produce, at 50 MB and ~90 MB each. Once you exceed some indeterminate bookmarks "size", Firefox never gracefully shuts down in a useful time frame. Folks here have said that when Firefox is asked to shut down and doesn't, you should go to Task Manager and kill the Firefox process. That is less desirable. The question is: when Firefox closes (on a request to shut down), is it trying to do a whole host of cleanup tasks / bookmarks backup? I cut back my bookmarks slightly, and now FF can be closed without a Task Manager shutdown, but none of my addons (I disabled some and deleted some) seem to restart (Firefox seems to thrash CPU utilization from near zero to ~100% trying to restart NoScript, and pounds away for many minutes, until I decide this is not good).
    I have Vista Business 32, configured to work well with the latest FF 26.0.
    Chrome (the SRWare Iron browser), in which I have zero bookmarks, runs fine. I need FF to work, reconfigured without a reinstall, because of all the passwords I have loaded into FF over time.
    Given that a minor 10-20% reduction in bookmarks, which took me a few hours to do, mostly restores the ability to exit FF without having to brutally kill the task (and its large memory footprint) in Task Manager, this is weird. FF never exceeds my Win32 memory; the largest footprint I had, I think, was ~1 GB from lots of web pages.
    But presently I cannot seem to restart NoScript... which is troubling.

    I am not a software engineer, merely a user; I am not asking for help to personally fix what is fundamentally a product software problem.
    I'd also point out that in addition to hanging on backups (3 days, actually, of FF running the bookmarks backup that produced the 50 MB JSON file and the 90 MB HTML backup file), after half a day's respite in which Firefox actually closed on exit (after I completed the backup and reduced my bookmarks slightly, ~10% deleted), it is now back to its usual self: taking forever to shut down, looking like a memory-hog monster with a >0.5 GB RAM footprint that often grows on exit, and persisting in the Task Manager process list ad nauseam until a manually forced shutdown.
    A user (not a software coder) should not have to twiddle with arcane settings buried in some obscure, undocumented file to avoid having a program hog their machine for 3 days just to back up bookmarks manually (meaning the code in the bookmarks backup procedure scales rather inefficiently to a large number of bookmarks),
    and I suspect FF's internal shutdown procedure similarly does some rather inefficient, needlessly fine-grained logging and writing of files to disk when the user asks to exit (hopefully it would shut down in reasonable time and free the dual-core CPU from 50-100% utilization for days, backing up and saving stuff I don't care enough about to wait days for).
    If RAM were used to assemble the backup entirely in memory, enabling a small series of long serial block writes to disk, I suspect this would all go more smoothly than what appears at present to be a huge series of small writes to the disk.
    I can tell this by watching the disk activity light slowly flashing and Task Manager CPU activity spiking to 100% and dropping regularly (at high frequency) for hours or days in the case of a manual backup.
    Load ALL the data into RAM instead and do one large sequential write (sort of obvious for code performance, and for avoiding days-long bookmark backups?). There is plenty of room to load the JSON or HTML entirely into RAM, massage it, and dump it in one long, faster sequential write to disk. Or am I missing something?
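The "assemble in RAM, one sequential write" idea above, sketched in plain Java for illustration (this is not Firefox's actual backup code; the class and helper names are mine):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class OneShotBackup {
    // Assemble the whole backup in memory, then write it in one call,
    // instead of issuing thousands of tiny writes to disk.
    static Path writeBackup(Iterable<String> bookmarks, Path dest) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        for (String b : bookmarks) {
            buf.write(b.getBytes());
            buf.write('\n');
        }
        Files.write(dest, buf.toByteArray()); // one large sequential write
        return dest;
    }

    public static void main(String[] args) throws IOException {
        Path p = Files.createTempFile("bookmarks", ".html");
        writeBackup(java.util.List.of("<a href=\"http://a\">a</a>",
                                      "<a href=\"http://b\">b</a>"), p);
        System.out.println(Files.size(p) + " bytes written");
    }
}
```

Whether this would actually fix the hang depends on where Firefox spends the time (it may be database work in places.sqlite rather than raw file writes), but batching output this way is the standard cure for the small-write pattern the disk light suggests.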
    The justifications given are almost silly, as I merely want to use FF, not make a career of learning undocumented, buried feature settings. I have better things to do with my time, and I will remark that the issues I describe might not be unique to FF; they might also be common in other browsers (I do not know, but I will be gracious enough to say so).
    Granted, the program is free, but memory and task management efficiency is mediocre for truly heavy users. I read a lot and sometimes have 100 tabs open; the manner in which FF handles this sometimes brings a dual-core CPU on Windows Vista (optimized via registry settings recommended in deeper articles about Vista, to be more XP-like in operation) to its knees.
    This reminds me of the creeping featurization of Gmail, which was originally fast and impressively efficient. FF might not properly vet addons for good memory and shutdown behavior, and might consider doing so, blocking those which affect stable operation from being downloaded, as might actually testing / validating the safety, security, and privacy of some addons (rather than leaving it to unsophisticated users to do so haphazardly).
    Just some obvious observations.
    It should not take common users fiddling with undocumented, obscure features (hidden settings in undocumented files) to make a quality product (FF is good) run properly. There is something to think about here.

  • Help needed trying make thumbnails from large Image Files

    I posted this to the Flex message board, too; I didn't notice
    the ActionScript 3.0 forum until after. Sorry. Anyway, I have some
    ActionScript 3.0 code, created with Flex, that loads a large
    JPEG image (say 3000x2000 pixels) from which I'm trying to create
    a 100-pixel thumbnail. I have the code working to the point where
    I generate the thumbnail, but it's not maintaining the aspect
    ratio of the original image. It's making it square, filling in
    white for the part that doesn't fit.
    I've tried setting just the height or just the width of the new
    image, but that doesn't render well either.
    To see what I'm talking about, I made a screenshot showing
    the before image and the rendered thumbnail:
    Image of demo
    app at Flickr.
    Now, there are a few things important to note. I'm saving the
    thumbnail off as a JPEG. As you can see in my sample application,
    the original renders fine with the proper aspect ratio; it's when
    I copy the bytes off the BitmapData object, where I need to
    specify a width and height, that the trouble starts. I've also
    tried using .contentHeight and .contentWidth with some division to
    compute a new BitmapData size manually, but those values always
    seem to be NaN.
    private function makeThumbnail():void {
        // Create a thumbnail of a large file. What I want is a
        // thumbnail whose longest side is 100 pixels, keeping the
        // aspect ratio.
        var img:Image = new Image();
        // Add this event listener because we can't copy the
        // BitmapData from an image until it is loaded.
        img.addEventListener(FlexEvent.UPDATE_COMPLETE, imageLoaded);
        img.width = 100;
        img.height = 100;
        img.scaleContent = true;
        img.visible = true;
        // This is the image we want to make a thumbnail of.
        img.load("file:///C:/T5.jpg");
        img.id = "testImage";
        this.addChildAt(img, 0);
    }

    private function imageLoaded(event:Event):void {
        // Grab the BitmapData from the input Image.
        var bmd:BitmapData =
            getBitmapDataFromUIComponent(UIComponent(event.target));
        // Render the new thumbnail in the UI to see what it looks like.
        theImage.source = new Bitmap(bmd);
    }

    public static function getBitmapDataFromUIComponent(component:UIComponent):BitmapData {
        var bmd:BitmapData = new BitmapData(component.width, component.height);
        bmd.draw(component);
        return bmd;
    }
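    The missing piece is computing the target width and height from the source dimensions before creating the BitmapData, rather than forcing both to 100. The arithmetic is language-independent; here is a minimal sketch in Python (the function name is mine, not a Flex API):

    ```python
    def thumbnail_size(width, height, longest=100):
        """Scale (width, height) so the longest side becomes `longest`,
        preserving the original aspect ratio."""
        scale = longest / max(width, height)
        return (round(width * scale), round(height * scale))

    # A 3000x2000 source maps to 100x67 instead of a distorted 100x100.
    print(thumbnail_size(3000, 2000))  # → (100, 67)
    ```

    In the ActionScript above, the equivalent would be passing these computed dimensions to the BitmapData constructor and scaling the draw accordingly, instead of hard-coding 100x100.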

    Hi Tod,
    Take a look at this post: http://forums.adobe.com/message/260127#260127
    I'll have to say the smoothing of the thumbnail is not perfect, but maybe in combination with your code you'll get good results.
    Post a solution if you achieve smoother thumbnail results.
    Greets, Jacob

  • Will more RAM help handle large Excel files?

    Hi everyone,
    I know this message may be better suited to the Office for Mac Forums website, but the site hasn’t been able to validate me as a new user so far (it's been weeks of trying!), and I’m therefore unable to ask a question. So I have to resort to annoying you here.
    I run Excel 2008 (12.1.7) on an 8-core Mac Pro (early 2008) with 6 GB of RAM (OS X 10.5.6). I encounter a lot of problems working with big files, such as a 385 MB file comprising over 600,000 rows and 50 columns of mostly data (only a few columns contain simple formulas).
    After loading the file (which took quite a while), Activity Monitor indicated that Excel was using 1.12 GB of real memory and 2.11 GB of virtual, while 3.66 GB of RAM was free. When I then attempted to create a PivotTable from this data, the memory used by Excel gradually rose to 1.94 GB real and 2.93 GB virtual (2.85 GB of free memory was still available) while reading the data. At that point I got a message stating that there's not enough memory. I had to resort to breaking the file up into smaller ones to analyze the data, which is quite annoying.
    I was told that, being a 32-bit application, Excel 2008 is limited to addressing 4 GB of RAM. But I thought that since 4 GB is far more than the size of the file on disk, and my machine has 6 GB of RAM, I shouldn't have any problem working with it. Obviously I was wrong. Can someone please clarify what's going on here? Would more RAM significantly help?
    Any help will be greatly appreciated. Thank you!
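    The 4 GB figure is the 32-bit address-space ceiling, not a function of installed RAM, which is why adding memory beyond it cannot help a single 32-bit process. A trivial sanity check:

    ```python
    # A 32-bit process can address at most 2**32 bytes of memory,
    # regardless of how much physical RAM the machine has.
    address_space_gb = 2**32 / 2**30
    installed_ram_gb = 6
    # RAM beyond the addressable limit is invisible to the process.
    usable_by_excel_gb = min(installed_ram_gb, address_space_gb)
    print(address_space_gb, usable_by_excel_gb)  # → 4.0 4.0
    ```

    In practice the usable limit is lower still, since the OS reserves part of the address space, and a PivotTable keeps its own cache of the source data on top of the loaded worksheet, so running out of memory well below 4 GB is plausible here.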

    Yes, I bet the Windows version of Excel is better. I think it's definitely time to try it.
    When you say "having 8 DIMMs", do you mean filling up all 8 memory slots, or having 8 GB? I believe I remember seeing that 10 GB was the sweet spot, with all 8 memory slots occupied. Is that right? And what brand of RAM do you like currently, still OWC?
    Thank you Hatter.
