Working with large images

I need to work with very large images (3000x3000 pixels) and they take a lot of memory (around 50 MB). The thing is, they don't use many colors, but they are loaded into memory with a 24-bit color model.
The image is a JPEG and it is decoded to a BufferedImage with the standard JPEG decoder. I want to use BufferedImage because it is very simple to work with; however, I need to find a way to reduce memory consumption.
Is it possible to somehow use a different color model to reduce the color count to 256, which is enough for me? I am clueless about imaging and I simply haven't got the time to get into it, so I need some hints. I just want to know whether it is possible (and how) to reduce the in-memory size of a BufferedImage by decreasing its number of colors.
I was thinking about using a GIF encoder, but it works with Images, not BufferedImages, and the conversion takes a lot of time, so it is not a solution.
Thanks!
Dario

You can create a byte-indexed BufferedImage. This uses just 1 byte per pixel.
Use
public BufferedImage(int width,
                     int height,
                     int imageType,
                     IndexColorModel cm)
with
imageType == BufferedImage.TYPE_BYTE_INDEXED
(note: TYPE_BYTE_BINARY packs 1, 2 or 4 bits per pixel and allows at most 16 colors, so for a 256-color palette you want TYPE_BYTE_INDEXED)
and create a color palette with
new IndexColorModel(...)
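A minimal, self-contained sketch of this approach. The grayscale palette and the blank source image are stand-ins; in practice you would build the palette from the colors your JPEGs actually use, and source would be the image produced by the JPEG decoder:

```java
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.awt.image.IndexColorModel;

public class IndexedImageDemo {
    public static void main(String[] args) {
        // Build a 256-entry palette; a grayscale ramp is used here for illustration.
        byte[] r = new byte[256], g = new byte[256], b = new byte[256];
        for (int i = 0; i < 256; i++) {
            r[i] = g[i] = b[i] = (byte) i;
        }
        IndexColorModel palette = new IndexColorModel(8, 256, r, g, b);

        // 1 byte per pixel instead of 3-4: a 3000x3000 image is ~9 MB
        // instead of the ~27-36 MB a 24/32-bit BufferedImage needs.
        BufferedImage indexed = new BufferedImage(
                3000, 3000, BufferedImage.TYPE_BYTE_INDEXED, palette);

        // To convert a decoded 24-bit image, draw it into the indexed image.
        // 'source' stands in for the BufferedImage from the JPEG decoder.
        BufferedImage source = new BufferedImage(3000, 3000, BufferedImage.TYPE_INT_RGB);
        Graphics2D g2 = indexed.createGraphics();
        g2.drawImage(source, 0, 0, null);
        g2.dispose();

        System.out.println(indexed.getColorModel().getPixelSize()); // 8 bits per pixel
    }
}
```

Note that drawing into the indexed image quantizes each pixel to the nearest palette entry, so the palette must actually cover the colors in your images.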

Similar Messages

  • Illustrator Image Trace doesn't work with large images

    Illustrator's Image Trace doesn't seem to work at all with large images.  The attached image shows the following:
    A) The original raster cropped tightly.
    B) Converting the raster with default "Black and White Logo" settings.  Results in 242 paths and 4792 anchors.
    C) Adding a massive amount of white space and then converting the raster with default "Black and White Logo" settings.  Results in 407 paths and 1620 anchors.
    For whatever reason Illustrator can't seem to handle large images.  This test shows it's not an issue with the image being too complex since all I've done is add white space and the file was saved without compression so there should be no noise.
    This type of glitch started with CS6 with the new Image Trace tool.  Is there a fix for this?  Maybe setting Image Trace into legacy mode?

    Moving to Illustrator forum.

  • Best way to work with large image sequences

    Hello,
I'm relatively new to Adobe CS4. I am learning to create detailed animations with another software package that produces thousands of still images. (I normally save them as .png, 720 x 480, 29.97 fps.) I spent the money to buy CS4 because I was told this was what I needed to turn those images into video, add titles, captions, credits, etc. So far, I've been using Premiere Pro to import the image sequences and add titles. But when I export the sequences, the quality is crap. I've tried all combinations of export formats, but even when I go for the greatest quality, my files are 100 MB large and the quality is still crap (nice smooth lines are jagged, and details are blurry and pixelated, even in the titles I added). Can someone tell me how I should be working with these still image sequences so that I can retain the best quality while keeping the file size down? These videos are just played on local computer systems, not over a network or on DVDs. I don't know if I should be bringing the shorter clip sequences into After Effects and somehow exporting those. I just don't know, but nothing is working for me. Help!
    Nikki

Hello!
The output frame size has so far been the same as my rendering size (720 x 480), but I'm just trying to get the best quality; if I need to render larger, I can. Delivery has so far just been video, no audio, although when I get better at this I will record narrations of the animations and include those. Playback is intended only for computers. Every once in a while it is requested in a format that will be played over our network, but I wanted to worry about that later.
    The preset I chose for the sequence was the DV-NTSC Standard 32kHz
    The project/sequence settings are:
    General
         Editing Mode: DV NTSC
         Timebase: 29.97 fps
    Video
         Frame size: 720, 480 4:3
         Pixel Aspect Ratio: D1/DV NTSC (0.9091)
         Display Format: 30 fps Non-Drop-Frame Timecode
    Audio
     doesn't matter
    Video Previews
         Preview File Format: NTSC DV
         Codec: DV NTSC
         Maximum Bit Depth
    Hope that was the information you requested.
    Nikki

  • Working with large images in Motion 3

    Hi, I'm trying to work with a large image that I am zooming in and out of and moving around as a 3D layer, but Motion won't let me import it at its full resolution. I don't mind working at a lower resolution while I animate, but for the final output it shouldn't matter that I'm using an image larger than 2896 pixels. That limit is bogus! I understand I can't expect speedy results, but to block it entirely is ridiculous.
    Am I missing something here? It seems there are a couple options. 1, use the vector version and uncheck the fixed resolution box, but that takes forever to render. 2, I can cut up the map into like 10 different squares that are under the limit, which is time consuming and ridiculous because if I make an edit to the original file, I'm going to have to re-cut it all out again.
    Any tips people? I know I'm not the only one out there trying to do this kind of stuff in Motion! I'm really trying to do this all in Motion because it should be capable, but maybe I'm gonna have to go back into AfterEffects.

    Am I missing something here? It seems there are a couple options. 1, use the vector version and uncheck the fixed resolution box, but that takes forever to render. 2, I can cut up the map into like 10 different squares that are under the limit, which is time consuming and ridiculous because if I make an edit to the original file, I'm going to have to re-cut it all out again.
    Vector art? Really? Missed that in your OP. Forget the vectors; rasterize it in Illustrator to a TIFF at 3k pixels.
    But you'll need to carefully evaluate your need for zooming in seamlessly. I think most of us get around the limits of our tools that butt up against our imaginations by artfully using delicate transitions from one resolution to the next.
    Not knowing exactly what you're trying to do and what your screen presentation is expected to look like makes it difficult to suggest workarounds you would accept.
    bogiesan

  • Need Oil Paint filter enhancement to work with large images

    As Adobe is not supporting the Oil Paint filter in the next version of Photoshop, can the code for this filter be made available for open-source development?
    I desperately need to utilize effects that can be created using Oil Paint, but I would have to have the code altered so I can apply these effects to relatively large image sizes.
    The current filter works fine for relatively small images of around 500 KB or so, but has little or no effect on images of around 100 MB.
    I need to have this capability, and would like to have the code of the current filter altered to achieve it.
    I realize I will need to continue using Photoshop CS6, as the code will not be supported in the next version of Photoshop.
    If this is not the correct forum to post this, please let me know where I should be asking this question.
    Thank you in advance for any insight and advice you may have.

    What's the problem you're having with applying it to an 80 MB image?  Is it that the effects don't scale up enough?
    Technically it can run on large images if you have the computer resources...  I've just successfully applied it to an 80 MB image, and with the sliders turned all the way up it looks pretty oil painty, though it kind of drops back into a realistic looking photo when zoomed out...
    If it's just that the sliders can't go high enough, given that it's a very abstract look you're trying to achieve, have you considered applying it at a downsized resolution, then upsampling, then maybe applying it again?  This is done that way...
    Oh, and by the way, Oil Paint has been removed from Photoshop CC 2014, so if you're planning a business based on automatically turning people's large photos into oil paintings you should assume you'll be stuck with running the older Photoshop version.
    -Noel

  • Beachballs & Hangups with large images in CC.

    I wasn't having this issue in CS6, and then I upgraded to CC. I couldn't work with large image files on my art board. I'd always beach ball while copying an image! It was crapping out my workflow, and I have more than enough ram and processing power.
    Solution:
    Turns out the primary reason for the hang-ups was caused by the 'included SVG code' option (which doesn't exist in CS6) in the File Handling Preference. I unchecked it, and it's been smooth sailing since.
    I hope this helps someone!

    Well, your resolution in Photoshop has only some relation to the size of your document in InDesign. Perhaps CS4 has something new that CS3 doesn't, but I can't set an Indy doc to 1024 x 768 pixels; the only measurements I get are things like picas, inches, and other non-screen measurements.
    When working in Photoshop (really not my strong point), you have control over both the image size and the canvas size, right? And one of the values you're working with is going to be pixels per inch, no? I'm guessing that a Photoshop image with a size of 1024 x 768 pixels at 72 ppi (screen resolution) is what you are placing in an InDesign document, which is simply not going to display that information at 72 ppi. It'll probably display it at 300 ppi, or something similar, no?
    This just shows you my basic ignorance of Photoshop. However, even someone as PS-impaired as myself knows that you can set the image size and canvas size in Photoshop, and this will control your number of pixels per inch. If you save a PSD or a TIFF out of PS, and you have your canvas and image settings correct, then you'll get a good idea onscreen of how large your image is inside your ID document, and what your resolution will be. In fact, InDesign will merrily let you place PSDs at whatever resolution you set up in PS. So, to sum up:
    1) Your image is being placed at a print-level resolution, which is why it's so small
    2) You determine the resolution of your image in Photoshop
    3) Bob's question bears repeating: 1024 whats?

  • Working with large Artboards/Files in Illustrator

    Hello all!
    I'm currently designing a full size film poster for a client. The dimensions of the poster are 27" x 40" (industry standard film poster).
    I am a little uncertain in working with large files in Illustrator, so I have several problems that have come up in several design projects using similar large formats.
    The file size is MASSIVE. This poster uses several large, high-res images that I've embedded. I didn't want them to pixelate, so I made sure they were high quality. After embedding all these images, along with the vector graphics, the entire .ai file is 500MB. How can I reduce this file size? Can I do something with the images to make the .ai file smaller?
    I made my artboard 27" x 40" - the final size of the poster. Is this standard practice? Or when designing for a large print format, are you supposed to make a smaller, more manageable artboard size, and then scale up after to avoid these massive file sizes?
    I need to upload my support files for the project, including .ai and .eps - so it won't work if they're 500MB. This would be good info to understand for all projects I think.
    Any help with this would be appreciated. I can't seem to find any coherent information pertaining to this problem that seems to address my particular issues. Thank you very much!
    Asher

    Hi Asher,
    It's probably those high-res images you've embedded. Firstly, be sure your images are only as large as you need them. Secondly, a solution would be to use linked images while you're working instead of embedding them in the file.
    Here is a link to a forum with a lot of great discussion about this issue, to get you started: http://www.cartotalk.com/lofiversion/index.php?t126.html
    And another: http://www.graphicdesignforum.com/forum/archive/index.php/t-1907.html
    Here is a great list of tips that someone in the above forum gave:
    -Properly scale files. Do not take a 6x6' file and then use the scaling tool to make it 2x2'. Instead, scale it in Photoshop to 2x2 and reimport it. Make a rule: anything over 20%, bring it back into Photoshop for rescaling.
    -Check resolutions. 600 dpi is not going to be necessary for every printer.
    -Delete unused art. Sloppy artists may leave old unused images under another image. The old one is not being used but it still takes up space, unnecessarily inflating your file.
    -Choose to link instead of embed. This is your choice. Either way you still have to send a large file, but often linking is less total MB than embedding. Also, linking works well with duplicated images: multiple uses link to one original, whereas embedding would make copies.
    -When you are done, use compression software like ZIP or SIT (StuffIt):
    http://www.maczipit.com/
    Compression can reduce file sizes a lot, depending on the files.
    This business deals with a lot of large files. Generally people use FTP to send large files, or plain old CD. Another option is segmented compression. Something like WinRAR or DropSegment (part of StuffIt Deluxe) compresses files, then breaks them up into smaller, manageable pieces. This way you can break a 50 MB file into, say, 10 x 5 MB pieces and send them 5 MB at a time.
    http://www.rarlab.com/download.htm
    *Make sure your client knows how to uncompress those files. You may want to link them to the site to download the software."
    Good luck!

  • Problem converting XPS document with large images

    Hi,
    When I try to convert XPS documents to PDF in Acrobat 9.0 Pro, I have problems with "large images". They are either truncated or scaled down. The only images that convert correctly are small images that are not stretched to fill a large area. Attached to this message is a simple example. Does anyone know a way to correct this problem? By the way, the page is slightly larger than the standard Letter size.
    Thanks!

    Thanks for your suggestion! In fact, I am the programmer who creates the XPS files from a .NET application. Our company used the "PostScript + PDF" method for a few months and it worked quite well. Recently, we introduced semi-transparent objects and fonts in our documents and the resulting PS prints look really bad. Since I can create a perfect XPS document, and since Acrobat can convert it to PDF, I gave it a try. It solved all the problems that I had with PS printing, but there is that issue with images... Anyway, since Acrobat converts XPS to PDF, I guess they should try to fix the bug with the images. As for me, I will see if I can get a component to create PDF documents directly from our application.

  • Spot removal tool lethargy with large image

    Question: is there a way to speed up Spot Removal in my circumstance?
    Background: 16-bit 4x5 scanned B & W image - 1.04GB .dng file
    Win 7, 64-bit, 12 GB RAM, dual CPUs
    With large images, spot removal is very slow, and becomes exceedingly slow as the number of spots increases.
    I have Lightroom 5 configured to use 10 GB cache
    I have cleared the cache, closed other apps, re-booted windows, with no increase in speed
    Windows task manager reports 0-35% cpu load, 4-6GB RAM used including the 1.5 GB consumed by the OS
    I do know that file size may be an issue -- Lightroom reported that a 1.32GB image is too large to load.
    What will work, if anything, to speed this up?

    PS CS5 and Camera Raw 6.x don't support the new features in LR 5. You need to use LR's 'Edit in PS' and then select 'Edit a Copy with Lightroom Adjustments.' Is that what you are doing? I'm using LR 5.2 Final with PS CS6 and ACR 8.2 and it appears to work fine.
    However, I am experiencing the feathering issue with the LR 5.2 Spot Removal tool:
    http://feedback.photoshop.com/photoshop_family/topics/lr_5_2_is_not_rendering_correctly_photos_with_spot_removal_applied_in_lr4

  • Have discovered how to split visual and audio tracks. However the cut away only seems to work with still images from iPhoto!

    Trying to cut away from one video clip to a second one while keeping the original sound track: this works with still images imported from iPhoto, but I can't get the video to drag and drop. Please help, thanks.

    simonjacobson wrote:
    Drag clip over target, green plus sign appears.........clip appears at front of video, no menu option appears,
    OK that sounds correct. Just to clarify, when the sign appears, as you release the mouse button (or whatever you are using), no menu pops up but the clip immediately positions itself:
    at the front of the target clip, OR
    at the very beginning of your project
    Not sure where I'm heading here, just trying to get the complete picture.
    I did it once, but not again. Do I need to double-click/right-click or something?
    No, there is no double-click or right-click involved at all. Just drag and drop, then select from the pop-up menu. You should be able to repeat this with no issues. I've also tested dropping video onto a still picture - the cutaway works on that as well.
    Something is definitely not working correctly - strange! Quit iMovie, then try deleting the preference file as I described in my last post. Quitting and relaunching iMovie might even fix it. Also, try restarting your Mac - that sometimes helps.
    John

  • Crash While Working With an Image Sequence

    So I am working with an image sequence in my timeline and every time I touch that clip (either to render it or move it) Premiere crashes.
    I am working on an iMac with 32 GB of RAM and OS X 10.10.3. I recently upgraded from Premiere Pro CS6 to CC and since then I am unable to work with it.
    Here is the error log I keep getting:
    Process:               Adobe Premiere Pro CC 2014 [839]
    Path:                  /Applications/Adobe Premiere Pro CC 2014/Adobe Premiere Pro CC 2014.app/Contents/MacOS/Adobe Premiere Pro CC 2014
    Identifier:            com.adobe.AdobePremierePro
    Version:               8.2.0 (8.2.0)
    Code Type:             X86-64 (Native)
    Parent Process:        ??? [1]
    Responsible:           Adobe Premiere Pro CC 2014 [839]
    User ID:               501
    Date/Time:             2015-04-18 16:36:11.311 -0700
    OS Version:            Mac OS X 10.10.3 (14D136)
    Report Version:        11
    Anonymous UUID:        06821F38-EC27-D4F0-775A-7CB2D2257EE6
    Time Awake Since Boot: 3900 seconds
    Crashed Thread:        24  Dispatch queue: opencl_runtime
    Exception Type:        EXC_BAD_ACCESS (SIGSEGV)
    Exception Codes:       KERN_INVALID_ADDRESS at 0x0000000000000008

    It seems to happen when I am not using any tools. When I have a photo open for any purpose.
    It occurs with no input.  I fix it by zooming back to where I was.
    I am not working with a tablet.  I am not working with a mouse.
    I will need to look up these preferences. I did not know they existed.
    Thanks.
    Duane

  • Working with grey images

    Hey guys, got a new question for you. I'm working with grey images from an old newspaper from the 1940s in Photoshop; originally the format was Working Gray - Dot Gain 20%. My question is, since this is a grey image with no CMYK to be found, what would you suggest as the ideal format so that when these images are sent to the printer they will come out looking good? Thanks in advance, guys.
    Jason

    What I'm doing for JPEGs:
    72 ppi at current size
    144 ppi at half size
    215 ppi at 1/3 size
    288 ppi at 1/4 size
    and even lower if I can get the images that small, but these are intended for a small brochure so I figure they will be okay at around 1 inch wide per placed image in indesign.

  • Working with Large Numbers

    Hi there,
    I am currently doing a school assignment and not looking for answers but just a little guidance.
    I am working with large numbers and the modulo operator.
    I might have some numbers such as :
    int n = 221;
    int e = 5;
    int d = 77;
    int message = 84;
    int en = (int) (Math.pow(message, e) % n);
    int dn = (int) (Math.pow(en, d) % n);
    Would there be a better way to do this kind of calculation? The dn value should come out the same as message, but I always get something different, and I think I might be losing something because an int can only hold small values.
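Right: message^e here is 84^5, which is about 4.18 billion and already overflows an int, and Math.pow works in doubles, so precision is lost before the modulo is ever taken. A minimal sketch of the same calculation using java.math.BigInteger.modPow, which performs modular exponentiation exactly:

```java
import java.math.BigInteger;

public class ModPowDemo {
    public static void main(String[] args) {
        BigInteger n = BigInteger.valueOf(221);
        BigInteger e = BigInteger.valueOf(5);
        BigInteger d = BigInteger.valueOf(77);
        BigInteger message = BigInteger.valueOf(84);

        // modPow never materializes message^e; it reduces mod n as it goes,
        // so there is no overflow and no floating-point rounding.
        BigInteger en = message.modPow(e, n); // "encrypt"
        BigInteger dn = en.modPow(d, n);      // "decrypt"

        System.out.println(en); // 67
        System.out.println(dn); // 84, the same as message
    }
}
```

With these numbers dn does come back equal to message, since 5 * 77 = 385 ≡ 1 (mod 192), where 192 is φ(221).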

    EJP wrote:
    It might make sense in some contexts to have a positive and negative infinity.
    Yes, perhaps that's a better name. Guess I was harking back to old COBOL days :-). (*)
    But the reason these things exist in FP is because the hardware can actually deliver them. That rationale doesn't apply to BigInteger.
    Actually, it does. All I'm talking about is a value that compares higher or lower than any other. That could be done either by a special internal sign value (my slight preference) or by simply adding code to the compareTo(), equals() and hashCode() methods that takes the two constants into account (as they already do with ZERO and ONE).
    Don't worry, I'm not holding my breath; but I have come across a few situations in which values like that would have been useful.
    Winston
    Edited by: YoungWinston on Mar 22, 2011 9:07 AM
    (*) Actually, '±infinity' tends to suggest a valid arithmetic value, and I wasn't thinking of changing existing BigInteger/BigDecimal maths (except perhaps to throw an exception if either value is involved).

  • Working with Large List in sharepoint 2010

    Hi All
    I have a list with almost 10k records in SharePoint, and based on a business requirement I am binding the data (almost 6k records) to an ASP.NET GridView that is visible on the home page of the portal, which most users access. Can someone please suggest the best method to improve performance so the program does not have to hit the SP list every time the page loads...
    Thanks & Regards
    Rakesh Kumar

    Hi,
    If you are working with large data retrievals from the content database (SharePoint list), the points below are for your reference:
    1. Limit the number of returned items.
    SPQuery query = new SPQuery();
    query.RowLimit = 6000; // retrieve at most 6,000 items per request
    query.ListItemCollectionPosition = prevItems.ListItemCollectionPosition; // starting at a previous position
    SPListItemCollection items = SPContext.Current.List.GetItems(query);
    2. Limit the number of returned columns.
    SPQuery query = new SPQuery();
    query.ViewFields = ""; // the original <FieldRef> markup was lost in the forum formatting
    3. Query specific items using CAML (Collaborative Application Markup Language).
    SPQuery query = new SPQuery();
    query.Query = "15"; // the original CAML markup was lost; only the value "15" survived
    4.Use ContentIterator class
    https://spcounselor-public.sharepoint.com/Blog/Post/2/Querying-a--big-list--with-ContentIterator-and-multiple-filters
    5. Create a Stored Procedure in Database to get the special data, create a web service to get the data, when create a web part to show the data in home page.
    Best Regards
    Dennis Guo
    TechNet Community Support

  • Is anyone working with large datasets (>200M) in LabVIEW?

    I am working with external bioinformatics databases and find the datasets to be quite large (2 files easily come out at 50 MB or more). Is anyone working with large datasets like these? What is your experience with performance?

    Colby, it all depends on how much memory you have in your system. You could be okay doing all that with 1GB of memory, but you still have to take care to not make copies of your data in your program. That said, I would not be surprised if your code could be written so that it would work on a machine with much less ram by using efficient algorithms. I am not a statistician, but I know that the averages & standard deviations can be calculated using a few bytes (even on arbitrary length data sets). Can't the ANOVA be performed using the standard deviations and means (and other information like the degrees of freedom, etc.)? Potentially, you could calculate all the various bits that are necessary and do the F-test with that information, and not need to ever have the entire data set in memory at one time. The tricky part for your application may be getting the desired data at the necessary times from all those different sources. I am usually working with files on disk where I grab x samples at a time, perform the statistics, dump the samples and get the next set, repeat as necessary. I can calculate the average of an arbitrary length data set easily by only loading one sample at a time from disk (it's still more efficient to work in small batches because the disk I/O overhead builds up).
    Let me use the calculation of the mean as an example (hopefully the notation makes sense): see the jpg. What this means in plain English is that the mean can be calculated solely as a function of the current data point, the previous mean, and the sample number. For instance, given the data set [1 2 3 4 5], sum it, and divide by 5, you get 3. Or take it a point at a time: the average of [1]=1, [2+1*1]/2=1.5, [3+1.5*2]/3=2, [4+2*3]/4=2.5, [5+2.5*4]/5=3. This second method required far more multiplications and divisions, but it only ever required remembering the previous mean and the sample number, in addition to the new data point. Using this technique, I can find the average of gigs of data without ever needing more than three doubles and an int32 in memory. A similar derivation can be done for the variance, but it's easier to look it up (I can provide it if you have trouble finding it). Also, I think this functionality is built into the LabVIEW point-by-point statistics functions.
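The point-at-a-time mean described above can be sketched in a few lines. Java is used here (as elsewhere on this page) rather than LabVIEW, and the five-sample data set is the one from the example:

```java
// Running (incremental) mean: mean_n = (x_n + mean_{n-1} * (n-1)) / n.
// Only the previous mean and the sample count are kept in memory,
// so an arbitrarily long data set can be averaged one sample at a time.
public class RunningMean {
    private double mean = 0.0;
    private long n = 0;

    public void add(double x) {
        n++;
        mean += (x - mean) / n; // algebraically the same update, numerically stabler
    }

    public double mean() { return mean; }

    public static void main(String[] args) {
        RunningMean rm = new RunningMean();
        for (double x : new double[] {1, 2, 3, 4, 5}) {
            rm.add(x); // in practice, x would be read one sample at a time from disk
        }
        System.out.println(rm.mean()); // 3.0
    }
}
```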
    I think you can probably get the data you need from those db's through some carefully crafted queries, but it's hard to say more without knowing a lot more about your application.
    Hope this helps!
    Chris
    Attachments:
    Mean Derivation.JPG ‏20 KB
