Gimp: big files [solution inside]

I'm trying to glue images together into one bigger image (each is about 3000x2000 px and there are 5 of them).
What I tried:
- open all 5 files in GIMP
- create a new image in GIMP, sized 8000x8000 px
This worked, but when I copy one of the smaller images, go to the bigger one and paste it, the system becomes horribly slow and the HDD works a lot; GIMP then redraws its windows extremely slowly.
I have 768 MB of RAM and only about 300 MB is occupied while trying this.
I think GIMP is using only part of the memory, and that's why it struggles.
Why is GIMP not using more memory?
Also, if you know another method for gluing pics together, I'd be glad to hear it, as I'm a newbie in this area.
As a workaround, I scaled all the pics down to a smaller size and then glued them together. You can see the result here:
http://daperi.home.solnet.ch/uni/bio4/p … logie.html
(click on the first image)
The original resolution is much higher, and the goal is to glue them at full resolution into a much better image.
Any suggestions welcome.
Thanks in advance.

Dusty wrote: Is File --> Preferences --> Environment --> Tile cache size what you're looking at?
It was set to 64 MB! I changed it to 400 and now GIMP works normally again, even with big files. Thanks a lot.
Dusty wrote: Another option is to edit smaller files. :-D
For me this is not an option but only a workaround, because I need to glue together images taken under the microscope; they must keep their resolution and size (to keep the details). With this method I can reconstruct on the computer, in one picture, the whole probe I looked at under the microscope, which opens great possibilities for archiving it. The trouble is that each pic is about 6*10^6 px, and if you glue 10 such pics together, you obviously need more than 64 MB. :-)
Thanks for helping.
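For what it's worth, the gluing itself can also be done outside GIMP with a short program, which sidesteps the interactive redraw problem entirely. A rough sketch using Java's ImageIO; the tile file names, canvas size and offsets below are invented placeholders, not anything from the thread:

```java
import javax.imageio.ImageIO;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.io.File;

public class Stitch {
    // Paste each tile onto one canvas at its (x, y) offset
    static BufferedImage stitch(BufferedImage[] tiles, int[][] offsets, int width, int height) {
        BufferedImage canvas = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = canvas.createGraphics();
        for (int i = 0; i < tiles.length; i++) {
            g.drawImage(tiles[i], offsets[i][0], offsets[i][1], null);
        }
        g.dispose();
        return canvas;
    }

    public static void main(String[] args) throws Exception {
        // hypothetical file names and a 3-wide / 2-high layout for five ~3000x2000 px tiles
        String[] names = {"t1.png", "t2.png", "t3.png", "t4.png", "t5.png"};
        int[][] offsets = {{0, 0}, {3000, 0}, {6000, 0}, {0, 2000}, {3000, 2000}};
        BufferedImage[] tiles = new BufferedImage[names.length];
        for (int i = 0; i < names.length; i++) {
            File f = new File(names[i]);
            if (!f.exists()) { System.out.println("missing tile: " + names[i]); return; }
            tiles[i] = ImageIO.read(f);
        }
        ImageIO.write(stitch(tiles, offsets, 9000, 4000), "png", new File("stitched.png"));
    }
}
```

Note this loads everything into RAM at once, so with five full-resolution tiles you'd want a machine with more memory than the 768 MB mentioned above.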

Similar Messages

  • I have Mountain Lion OS X and Parallels with Windows. Every time I click on a hyperlink in other applications it opens some 7-Zip file explorer inside Parallels but doesn't go to Safari. Does anyone know a solution?


    I suggest that you run Software Update, after which you should have Safari 6.0.5. Then check Safari > Preferences > Privacy and see that 'Block cookies' is not set to Always.
    Failing that, switch Safari extensions off via Safari > Preferences > Extensions and test again.

  • Adobe Photoshop CS3 crashes each time it loads a big file

    I was loading a big file of photos from iPhoto on my iMac into Adobe Photoshop CS3 and it kept crashing; each time I reopen Photoshop it loads the photos again and crashes again. Is there a way to stop this cycle?

    I don't think that too many users here actually use iPhoto (even the Mac users)
    However, Google is your friend. A quick search came up with some other non-Adobe forum entries:
    .... but the golden rule of iPhoto is NEVER EVER MESS WITH THE IPHOTO LIBRARY FROM OUTSIDE IPHOTO. In other words, anything you might want to do with the pictures in iPhoto can be done from *within the program,* and that is the only safe way to work with it. Don't go messing around inside the "package" that is the iPhoto Library unless you are REALLY keen to lose data, because that is exactly what will happen.
    ..... everything you want to do to a photo in iPhoto can be handled from *within the program.* This INCLUDES using a third-party editor, and it saves a lot of time and disk space if you do it this way:
    1. In iPhoto's preferences, specify a third-party editor (let's say Photoshop) to be used for editing photos.
    2. Now, when you right-click (or control-click) a photo in iPhoto, you have two options: Edit in Full Screen (ie iPhoto's own editor) or Edit with External Editor. Choose the latter.
    3. Photoshop will open, then the photo you selected will automatically open in PS. Do your editing, and when you save (not save as), PS "hands" the modified photo back to iPhoto, which treats it exactly the same as if you'd done that stuff in iPhoto's own editor and updates the thumbnail to reflect your changes. Best of all, your unmodified original remains untouched so you can always go back to it if necessary.

  • Rt2860 wifi network hangs on downloading big files.

    After upgrading to the 3.2 kernel, my rt2860 PCI card is not working properly. Wifi connects fine, and I can browse the internet and download small files. But if I try to download a big file (> 1 GB) over the LAN, it starts and then hangs after downloading 8-10 MB. I have to disconnect the network and connect again.
    I fixed it by installing the rt2860 package from the AUR: https://aur.archlinux.org/packages.php?ID=14557 and blacklisting rt2800pci.
    I would be happy with this solution, but now every time the kernel is updated I lose rt2860 after a restart, and I have to manually recompile and install the AUR rt2860 package again.
    Are there any tweaks or configs to fix the rt2800pci hanging problem? Or how can I keep from losing the AUR package after every kernel upgrade?

    I do not use rt2800pci on either Arch or Ubuntu. For me, it just doesn't work. You are doing better than I, because I can't get a connection at all with it.
    I put up with the effort of recompiling rt2860 after every kernel update, because it works. Besides, it helps me keep up my chops on compiling and installing kernel modules.
    Tim

  • How to parse a big file with Regex/Pattern

    I would like to parse a big file using Matcher/Pattern, so I thought of using a BufferedReader.
    The problem is that a BufferedReader constrains me to read the file line by line, and my patterns are not only inside a line but can also span the end of one line and the beginning of the next.
    For example this class:
    import java.util.regex.*;
    import java.io.*;
    public class Reg2 {
      public static void main(String[] args) throws IOException {
        File in = new File(args[1]);
        BufferedReader get = new BufferedReader(new FileReader(in));
        Pattern hunter = Pattern.compile(args[0]);
        String line;
        int lines = 0;
        int matches = 0;
        System.out.print("Looking for " + args[0]);
        System.out.println(" in " + args[1]);
        while ((line = get.readLine()) != null) {
          lines++;
          Matcher fit = hunter.matcher(line);
          // fit.matches() would require the whole line to match; find() looks for a substring
          if (fit.find()) {
            System.out.println(lines + ": " + line);
            matches++;
          }
        }
        if (matches == 0) {
          System.out.println("No matches in " + lines + " lines");
        }
      }
    }
    used with the pattern "ERTA" and this file (genomic sequence):
    AAAAAAAAAAAERTAAAAAAAAAERT [end of line]
    ABBBBBBBBBBBBBBBBBBBBBBERT [end of line]
    ACCCCCCCCCCCCCCCCCCCCCCERT [end of line]
    it reports that it has found the pattern only in this line:
    "1: AAAAAAAAAAAERTAAAAAAAAAERT"
    while my pattern is present 4 times.
    Is it really a good idea to use a BufferedReader?
    Does someone have an idea?
    thanx
    Edited by: jfact on Dec 21, 2007 4:39 PM
    Edited by: jfact on Dec 21, 2007 4:43 PM

    Quick and dirty demo:
    import java.io.*;
    import java.util.regex.*;
    public class LineDemo {
        public static void main(String[] args) throws IOException {
            BufferedReader get = new BufferedReader(new FileReader(new File("test.txt")));
            int found = 0;
            String lookingFor = "ERTA";
            String previous = "", next;
            Pattern p = Pattern.compile(lookingFor);
            while ((next = get.readLine()) != null) {
                String toInspect = previous + next;
                Matcher m = p.matcher(toInspect);
                while (m.find()) found++;
                // carry over only the last (patternLength - 1) chars, so a match
                // crossing the line break is found exactly once and short lines are safe
                int keep = lookingFor.length() - 1;
                previous = toInspect.substring(Math.max(0, toInspect.length() - keep));
            }
            get.close();
            System.out.println("Found '" + lookingFor + "' " + found + " times.");
        }
    }
    /* test.txt contains these four lines:
    AAAAAAAAAAAERTAAAAAAAAAERT
    ABBBBBBBBBBBBBBBBBBBBBBERT
    ACCCCCCCCCCCCCCCCCCCCCCERT
    ACCCCCCCCCCCCCCCCCCCCCCBBB
    */
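If the file comfortably fits in memory, another option is to skip line handling entirely: read the whole file, strip the line breaks, and match once. A sketch (the file name test.txt and pattern ERTA are taken from the thread; for really big inputs the line-by-line carry-over approach is the better fit):

```java
import java.nio.file.*;
import java.util.regex.*;

public class WholeFile {
    // Read the whole file, drop line breaks, then count matches in one pass.
    static int countMatches(String path, String regex) throws Exception {
        String all = new String(Files.readAllBytes(Paths.get(path))).replaceAll("\\r?\\n", "");
        Matcher m = Pattern.compile(regex).matcher(all);
        int found = 0;
        while (m.find()) found++;
        return found;
    }

    public static void main(String[] args) throws Exception {
        if (!Files.exists(Paths.get("test.txt"))) { System.out.println("no test.txt"); return; }
        System.out.println(countMatches("test.txt", "ERTA"));
    }
}
```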

  • How to exchange big files between mac users through the internet?

    How can we send and receive big files between Mac users (apart from Dropbox) ?

    If you aren't on the same local network (which I assume you aren't), then the easiest solution is to use Google.
    If you simply google "sending large files" you will get hundreds of suggestions/applications,
    e.g. Dropbox, Google, YouSendIt, FTP, etc.

  • Question about reading a very big file into a buffer.

    Hi, everyone!
    I want to randomly load several characters from the
    GB2312 charset to form a string.
    I have two questions:
    1. Where can I find the charset table file? I have
    googled for hours but failed to find a GB2312 charset
    file.
    2. I think the charset table file is very big, and I doubt
    whether I can load it into a String or StringBuffer. Does anyone
    have a solution? How do you load a very big file and randomly
    select several characters from it?
    Have I made myself understood?
    Thanks in advance,
    George

    The following gives the correspondence between GB2312-encoded byte arrays and characters (as hexadecimal integers).
    import java.nio.charset.*;
    import java.io.*;
    public class GBs {
        static String convert() throws UnsupportedEncodingException {
            StringBuffer buffer = new StringBuffer();
            String l_separator = System.getProperty("line.separator");
            Charset chset = Charset.forName("EUC_CN"); // GB2312 is an alias of this encoding
            CharsetEncoder encoder = chset.newEncoder();
            int[] indices = new int[Character.MAX_VALUE + 1];
            for (int j = 0; j <= Character.MAX_VALUE; j++) {
                if (encoder.canEncode((char) j)) indices[j] = 1;
            }
            byte[] encoded;
            for (int j = 0; j < indices.length; j++) {
                if (indices[j] == 1) {
                    encoded = Character.toString((char) j).getBytes("EUC_CN");
                    for (int q = 0; q < encoded.length; q++) {
                        buffer.append(Byte.toString(encoded[q]));
                        buffer.append(" ");
                    }
                    buffer.append(": 0x");
                    buffer.append(Integer.toHexString(j));
                    buffer.append(l_separator);
                }
            }
            return buffer.toString();
        }
        // the following is for testing
        public static void main(String[] args) throws Exception {
            System.out.println(GBs.convert());
        }
    }
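Building on the canEncode approach above: to answer the original question, no charset table file is needed at all — enumerate the encodable characters once into a pool and sample from it. A sketch (the character range and the fixed seed are arbitrary choices of mine, not from the thread):

```java
import java.nio.charset.Charset;
import java.nio.charset.CharsetEncoder;
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

public class RandomGB {
    // Build the pool of GB2312-encodable characters once, then sample from it.
    static String randomGbString(int length, long seed) {
        CharsetEncoder enc = Charset.forName("EUC_CN").newEncoder();
        List<Character> pool = new ArrayList<>();
        for (char c = 0x4E00; c <= 0x9FA5; c++) {   // CJK block; GB2312 covers a subset of it
            if (enc.canEncode(c)) pool.add(c);
        }
        StringBuilder sb = new StringBuilder();
        Random rnd = new Random(seed);
        for (int i = 0; i < length; i++) {
            sb.append(pool.get(rnd.nextInt(pool.size())));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(randomGbString(5, 42));
    }
}
```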

  • Downloading big files on 3G/4G

    hello all
    First of all, I'm mad at Nokia for blocking big downloads (over 20 MB) on 3G/4G.
    I have an unlimited internet plan on my SIM, but my phone stops me from using it.
    OK, here we go:
    I'd like Nokia to give us an on/off option for downloading big files over 3G.
    First, I understand 3G can sometimes be slower and less stable than wifi, but I believe it's up to the user whether to use it or not.
    Second, I understand some people go over their data allowance when downloading, so I suggest keeping the 20 MB limit on by default, but letting it be turned off in the settings menu.
    Now I'll explain why I'm making this post today, asking for an on/off option for downloads over 20 MB instead of having them disabled all the time.
    Today I went out in my car and got lost. Yes, lost. I went to use my phone to get me to a familiar place, but when I tried to use Nokia HERE Maps, it asked me to download my region's maps. I tried just England on its own; it was 200 MB and it demanded wifi. I had no way of getting wifi, I was lost, and Nokia wouldn't even let me try the download over 3G, even though I have an unlimited internet plan.
    I believe this option should be up to the user.
    Even if it's slow, downloads fail, or people hit data limits, that's up to the users. You're making customers hate Nokia/Windows Phones.
    thanks, keep the peace, please add your comments

    Nokia hasn't told you to go to the Windows webpage; I have suggested you do that. If you want an answer from Nokia you need to contact Nokia directly, as this is a public forum.
    Microsoft controls the software, so it's them you need to let know that you are unhappy; it really is that simple. I don't know what exactly can be said here to make you feel better.
    Nokia does pass user feedback on to Microsoft, but what is the harm in also telling Microsoft yourself? Is it really that difficult to copy and paste your comments onto another site?
    There is nothing further that the users of this forum can say or do to help you on this.

  • Validate and reject check-in for big files?

    Is there a way to validate and reject check-ins of big files?
    On the client side, it sounds like a custom check-in policy won't work: if users can override the policy at check-in, that still leaves a backdoor open.
    On the server side, I tried a TFS plugin but that doesn't work either: the CheckinNotification event doesn't fire until the check-in is already committed, and by then it's too late.
    Any other suggestion?
    Thx.

    Hi Garynguyen, 
    Thanks for your post.
    I think you need to create a check-in policy and a server-side plugin to enforce it; please refer to the solution in this article:
    https://binary-stuff.com/post/how-to-enforce-check-in-policies.

  • How to split one big file in java

    Hi,
    I have some big files (about 100 MB each). I can't use them in my program, and I want to split them into smaller files. The file format is XML; I am using the Xerces parser.
    Any solution is welcome.
    Thanks
    vq

    What do you mean, split? Of course your parser can't deal with a 100 MB XML file: it is building a tree representation in memory, and you don't have that much memory. Solution: don't generate files that big.
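That said, the in-memory tree is a choice of parser, not a law: a streaming parser reads one event at a time, so a 100 MB file is no problem, and you can write records out to smaller files as you go. A minimal sketch with the JDK's built-in StAX API — the file name big.xml and the element name record are made-up examples:

```java
import javax.xml.stream.*;
import java.io.FileInputStream;

public class StreamCount {
    // Walk the XML as a stream of events; memory use stays flat
    // no matter how big the file is.
    static int countRecords(String path, String element) throws Exception {
        XMLStreamReader r = XMLInputFactory.newInstance()
                .createXMLStreamReader(new FileInputStream(path));
        int n = 0;
        while (r.hasNext()) {
            if (r.next() == XMLStreamConstants.START_ELEMENT
                    && r.getLocalName().equals(element)) {
                n++;   // a real splitter would copy this record to an output file here
            }
        }
        r.close();
        return n;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(countRecords("big.xml", "record"));
    }
}
```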

  • WRT160NL: copying big files to attached storage will not succeed

    When I put a big file, like a 7 GB ISO image, on the hard disk connected to my router, the connection breaks down in the middle of the copying process. I tried two different hard disks with the same problem, so I figure it isn't the hard disk itself. Smaller files are no problem. Is there a timeout setting or something?
    Who can help me?

    Actually, 7 GB is a large amount of data. However, try changing the wireless channel on the router and check again.
    Which security type are you using on the router? It is recommended to use WPA2 security to get proper speed from the WRT160NL router.

  • Not enough space on my new SSD drive to import my data from time machine backup, how can I import my latest backup minus some big files?

    I just got a new 256 GB SSD drive for my Mac. I want to import my data from my Time Machine backup, but it's larger than 256 GB since it used to be on my old drive. How can I import my latest backup while keeping some big files out on the external drive?

    Hello Salemr,
    When you restore from a Time Machine backup, you can tell it not to transfer folders like Desktop, Documents, Downloads, Movies, Music, Pictures and Public. Take a look at the article below for the steps to restore from your backup.
    Move your data to a new Mac
    http://support.apple.com/en-us/ht5872
    Regards,
    -Norm G. 

  • Photoshop CC slow in performance on big files

    Hello there!
    I've been using PS CS4 since release and upgraded to CS6 Master Collection last year.
    Since my system broke down some weeks ago (a RAM module died), I gave Photoshop CC a try. At the same time I moved into new rooms and couldn't get my hands on the DVD of my CS6, resting somewhere at home...
    So I tried CC.
    Right now I'm using it with some big files. File size is between 2 GB and 7.5 GB max (all PSB).
    Photoshop seemed to run fast in the very beginning, but for a few days now it has been so unbelievably slow that I can't work properly.
    I wonder if it is caused by the growing files or some other issue with my machine.
    The files contain a large number of layers and masks, nearly 280 layers in the biggest file (mostly with masks).
    The images are 50 x 70 cm @ 300 dpi.
    When I try to make some brush strokes on a layer mask in the biggest file, it takes 5-20 seconds for the brush to draw... I couldn't figure out why.
    And it doesn't depend on the brush size as much as you might expect... even very small brushes (2-10 px) show this issue from time to time.
    Also, switching masks and adjustment layers (gradient maps, selective color or levels) on and off takes ages to be displayed, sometimes more than 3 or 4 seconds.
    The same goes for panning around in the picture, zooming in and out, or moving layers.
    It's nearly impossible to get work done on these files in a reasonable time.
    I've never seen this on CS6.
    Now I wonder if there's something wrong with PS or the OS. But: I've never worked with files this big before.
    In March I worked on some 5 GB files with 150-200 layers in CS6, and it worked like a charm.
    SystemSpecs:
    I7 3930k (3,8 GHz)
    Asus P9X79 Deluxe
    64GB DDR3 1600Mhz Kingston HyperX
    GTX 570
    2x Corsair Force GT3 SSD
    Wacom Intuos 5 M Touch (I have some issues with the touch from time to time)
    WIN 7 Ultimate 64
    all systemupdates
    newest drivers
    PS CC
    System and PS are running on the first SSD, scratch is on the second. Both are set to be used by PS.
    79% of RAM is allocated to PS, cache levels are set to 5 or 6, and history states to 70. I also tried different cache tile sizes from 128K to 1024K, but it didn't help a lot.
    When I open the largest file, PS takes 20-23 GB of RAM.
    Any suggestions?
    best,
    moslye

    Is it just slow drawing, or is actual computation (image size, rotate, GBlur, etc.) also slow?
    If the slowdown is drawing, then the most likely culprit would be the video card driver. Update your driver from the GPU maker's website.
    If the computation slows down, then something is interfering with Photoshop. We've seen some third party plugins, and some antivirus software cause slowdowns over time.

  • How to make an applet read a text file packed inside its jar

    Hi All,
    I have written an applet named ReadFile.java which reads a text file in the same directory and does some manipulation of the file's contents.
    The applet code runs successfully when I run it from the command prompt:
    java ReadFile
    and I get the needed results.
    Then I made a jar file with the applet class and the text file:
    jar cvf rf.jar ReadFile.class File1.txt
    Then I included this applet in an HTML file as:
    <applet code="ReadFile.class" width="500" height="300" archive="rf.jar"></applet>
    After this, when I load the HTML file, the applet code is not executed fully; it throws FileNotFoundException: File1.txt.
    The applet is not recognizing the text file inside the jar file.
    Can anybody explain how to overcome this problem? Does any setting need to be made so the applet can find the text file inside the jar?

    what code in your applet gets the text file and reads it? are you using getResource or something similar?
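Exactly: a file inside a jar is not a File on disk, so FileReader cannot see it; it has to be loaded through the classloader instead. A sketch of the usual fix — the resource name /File1.txt is taken from the post, and the class name here is my own placeholder:

```java
import java.io.*;

public class ReadFromJar {
    // Read a text resource from the same jar/classpath as this class,
    // instead of opening it as a plain file (which fails once it's jarred up).
    static String read(String resourcePath) throws IOException {
        InputStream is = ReadFromJar.class.getResourceAsStream(resourcePath);
        if (is == null) throw new FileNotFoundException(resourcePath + " not found on classpath");
        BufferedReader r = new BufferedReader(new InputStreamReader(is, "UTF-8"));
        StringBuilder sb = new StringBuilder();
        String line;
        while ((line = r.readLine()) != null) sb.append(line).append('\n');
        r.close();
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        try {
            System.out.print(read("/File1.txt"));   // the file name from the post
        } catch (FileNotFoundException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

With this approach the same code works both when running from the directory and when File1.txt is packed inside rf.jar, since the classloader searches the jar.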

  • I've doubled my RAM but still can't save big files in Photoshop...

    I have just upgraded my RAM from the shipped 4 GB to 8 GB, but I still can't save big images in Photoshop. It seems to have made no difference to the speed of general tasks and saving, and I can't save big files in Photoshop at all. I already moved massive amounts of files off my computer onto my external hard drive, so there is much less on my computer now, and twice the RAM, but it is making no noticeable difference. When I click Memory under 'About My Mac', it shows that the RAM is installed and there is now twice the memory. Could this be something to do with Photoshop? I'm running CS6.

    Also, I just converted 220 cm to inches: roughly 86.6 inches, over 7 feet in length.
    With an image that large, you may want to consider downsampling to a lower DPI, which makes images of this size much easier to work on and process, and easier for your print house to handle as well.
    You might want to consider working with these rather large images at 225 DPI instead of 300 if resolution at close viewing distances is still a concern.
    Or, what you could try is working with the images at 300 DPI, but then saving/exporting them as JPEG images at the highest quality setting.
    I do a lot of projects where I use a high-resolution JPEG image to save on image-processing overhead.
    The final printed images still come out pretty clear, clean and crisp without having to process those large files at a print house or deal with the full-resolution image in a page-layout or illustration program.
