Is Illustrator running slow because of large file size?

Why is Illustrator running slow for me? I have an i7-3770K, 16 GB RAM, a GTX 660i 2 GB, an H100i, an AX1200i and a Maximus Formula V, and I'm running dual 24" 1920x1080 monitors. The document is 800x600 pixels with 5 layers and 100+ paths in each layer. Is the program running slow because the file is too big, or is there a problem with Illustrator?

If you have another internal HD available, you can select it as a scratch disk...
(The dialog I'm referring to is from CS3, but you'll find something similar under Preferences.)
If you are running low on disk space, there isn't enough scratch space to perform certain operations efficiently, which can slow things down considerably. Just a thought.
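A quick way to check how much free space your volumes actually have (on macOS or Linux; Windows shows the same thing in Explorer):

     # The "Avail" column is what's left for Illustrator's scratch files.
     df -h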

Similar Messages

  • Need to get large file sizes down to downloadable sizes - how?

    I am using Compressor 4 and have several professional videos I produced with Final Cut Studio in QuickTime/.MOV format using ProRes 422. They are all 720p and, of course, all have large file sizes. I needed to convert them so they are compatible with iTunes as well as Apple devices (Apple TV, iPad/iPod/iPhone, etc.). They also need to be downloadable in a reasonable time, and of course I want to maintain a quality image. Several of the videos are about an hour long, some a little more.
    I am unable to get most of these files small enough for practical download while still maintaining acceptable quality. For example:
    I have one video (QuickTime, rendered from FCPS) that runs 54 minutes: 1280x720, ProRes 422, Linear PCM, with a bit rate of 72.662 Mbps. The best I seem to be able to get out of Compressor is the "HD for Apple Devices (5 Mbps)" setting, which gives me a file that's just over 2 GB. Not bad, and the quality is still very good, but the file size is just too big for people to download. I get reports of people seeing estimates of 77 hours to download this one file.
    I did try the setting under Video Sharing Services / Large 540p Video Sharing just to see what it would give me, but it really wasn't that much smaller and the quality was noticeably degraded.
    I'm confused as to what to do about this. I have downloaded things from iTunes before that are good quality and about an hour long, and their file sizes are simply not this big.
    Can anyone give me some advice? Thank you. Grateful for anything you can offer.
    MC

    If you're not concerned with hitting a precise average bitrate, use the H.264 codec, set the bitrate to "automatic" and use the quality slider. Be sure to leave "keyframes" on "automatic"; forcing a keyframe every 24 frames or so makes H.264 quite inefficient and is only useful for live streaming, where you want people to be able to resynchronise as quickly as possible. Having multi-pass enabled might help, but I'm not sure what it does when using constant-quality encoding.
    Do a test encoding of a representative clip of maybe two minutes or so, play with the quality slider and see how far down you can go while still being satisfied with the quality. "Automatic" bitrate uses as many bits to encode each frame as are necessary to reach the required quality, so file sizes will vary with the type of material: complex textures and high motion use more bits, still frames and slow pans use very little.
    A light chroma-channel denoising may help improve perceived quality, as may very light sharpening if you scaled the material down.
    If you're still not happy with the bitrate vs. quality tradeoff, try reducing the resolution, but generally you should be able to get 720p down to around 2.5 to 3.0 Mbps with adequate quality.
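    (If you ever need to reproduce this outside Compressor, a roughly equivalent constant-quality encode with the ffmpeg command-line tool would look like the sketch below - these are not Compressor's own settings; adjust -crf to taste.)

         # -crf fixes quality (lower = better; 18-28 is the usual range) and lets
         # the bitrate float with scene complexity, just like "automatic" bitrate
         # plus the quality slider; keyframe placement stays automatic.
         ffmpeg -i input.mov -c:v libx264 -crf 21 -preset slow \
                -c:a aac -b:a 160k output.mp4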
    On the other hand, if someone needs 77 hours to download 2 GB, they have another problem anyway. That sounds like a 56k modem connection. What are they doing trying to download HD content in the first place?
    Bernd

  • Facebook application grows to a large file size

    Why do Facebook applications grow larger and larger in file size? What can I do? The longer I use the app, the more space it takes up. I use Facebook version 4.0.3 on an iPhone 4 running iOS 5.0.1.
    thank you.
    Eak.

    Most of the log rotation for the different components can be set in the EM console, but if you just want to do it with a cron job or a script, here is what I have used in the past. Edit it to your needs.
    #!/bin/sh
    # Truncate any Oracle AS log file that has grown past max_size bytes.
    ORACLE_HOME=/home/oracle10FRS
    apache_logs=$ORACLE_HOME/Apache/Apache/logs
    webcache_logs=$ORACLE_HOME/webcache/logs
    sysman_logs=$ORACLE_HOME/sysman/log
    j2ee_logs=$ORACLE_HOME/j2ee/OC4J_BI_Forms/log/OC4J_BI_Forms_default_island_1
    also_j2ee=$ORACLE_HOME/j2ee/home/log/home_default_island_1
    max_size=1500000000

    # truncate_logs <directory> <filename pattern>
    truncate_logs()
    {
         for file in "$1"/$2
         do
              size=`ls -l "$file" | awk '{print $5}'`
              echo "File $file is $size bytes"
              if [ "$size" -ge "$max_size" ]
              then
                   echo "That's too big. Truncating"
                   >"$file"
              fi
         done
    }

    truncate_logs "$apache_logs"   '*_log'
    truncate_logs "$webcache_logs" '*_log'
    truncate_logs "$sysman_logs"   '*log'
    truncate_logs "$j2ee_logs"     '*.log'
    truncate_logs "$also_j2ee"     '*.log'
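    To run it nightly, a crontab entry along these lines would do (the script path here is hypothetical):

         # Truncate oversized logs every night at 02:00
         0 2 * * * /home/oracle10FRS/scripts/truncate_logs.sh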

  • Does FW CS5 Mac crap out with large file sizes?

    Does Fireworks CS5 for Mac crap out with large file sizes like CS4 did? Files over 4-5 MB seem to tend to be flaky... Is there an optimum file size? I'm running a Mac tower, 2.93 GHz quad-core, with 8 GB RAM.

    Why not download the trial version and find out?

  • Creating a .pdf from a .ppt or .doc in Acrobat 9 results in strangely large file sizes

    I am printing to .pdf from Microsoft Word and PowerPoint (previously Office 2007, now Office 2010) and the file sizes are sometimes doubling. Also, for some larger documents, multiple .pdfs are generated, as if Acrobat were choosing where to cut off a document due to large file size and generating secondary, tertiary, etc. documents to continue the .pdf. The documents have placed images (Excel charts, etc.), but they are not enormous - that is, until they are converted into a .pdf. I had hoped that when I upgraded to Office 2010 this issue would go away, but unfortunately it hasn't. Any ideas? Thoughts? Thanks so much!

    One thing that can happen with MS Office applications when an image contains transparency is that the "image" is actually composed of multiple one-pixel images once converted to PDF. This can be confirmed by examining the file: try selecting the images with the TouchUp Object tool to see whether what looks like a single image is selected in its entirety.

  • Video Stalls - Larger File Size

    I just downloaded Battlestar Galactica season 2 and the video stalls at full screen. I did not have this problem with previous Battlestar Galactica shows and movies that I downloaded in the past.
    I notice the file size and total bit rate of the new shows are three times greater than those of the shows I've downloaded before.
    Using the Activity Monitor utility, I notice I'm maxing out my CPU when I go to full screen, which I think is causing the video to stall.
    I also notice that my previous TV show downloads no longer fill the screen like they did before the new downloads.
    I hope someone might know how I can get my TV shows working normally at full screen again, and why the new shows' file sizes are larger than before.
    Thanks for Your Help
    Ricky
    PowerBook G4 15" LCD 1.67GHz   Mac OS X (10.3.9)  

    Thanks for the reply. Shortly after I posted, I started to realize that Apple was offering better TV show quality, which results in larger file sizes. I also went to the Energy Saver section in System Preferences and found that Optimize Energy Settings was set to Presentations. I changed it to Highest Performance, which resolved the issue.
    iTunes 7.0.2
    Quicktime 7.1.3
    RAM 512MB
    Thanks for your help.

  • Aperture is exporting a large file size, e.g. the original image is 18.2 MB and the exported version (TIFF, 16-bit) is 47.9 MB, any ideas please

    Aperture is exporting a large file size, e.g. the original image is 18.2 MB and the exported version (TIFF, 16-bit) is 47.9 MB. Any ideas, please?

    Raw files, even if not compressed, should be smaller than a 24-bit TIFF, since they carry only one channel per pixel. My T3i shoots 14-bit 18 MP raws and has a raw file size of 24.5 MB*. An uncompressed 24-bit TIFF should have a size of 18 MP x 3 bytes per pixel, or 54 MB.
    *There must be some lossless compression going on, since 18 MP times 1.75 bytes per pixel is 31.5 MB for an uncompressed raw.
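    For reference, the arithmetic (the 16-bit row is an addition of mine, since the original poster exported a 16-bit TIFF):

         14-bit raw:       18 MP x 14/8 bytes per pixel            = 31.5 MB (before lossless compression)
         8-bit RGB TIFF:   18 MP x 3 channels x 1 byte per pixel   = 54 MB
         16-bit RGB TIFF:  18 MP x 3 channels x 2 bytes per pixel  = 108 MB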

  • Saving jpeg and png files large file size

    I've recently purchased Web Premium 5.5 and I'm trying to save files in JPEG and PNG format in Photoshop, and I'm getting unexpectedly large file sizes. For example, I save a simple logo image, 246x48, consisting of only 3 basic colours; when I save it as a JPEG at quality 11, the Finder reports the file size as 61.6k, and when I save it as a PNG it's 60.2k. I also cropped the image to 190x48 and saved it as a PNG, and the file size is actually larger, at 61.9k; saved as a JPEG at quality 7, the file size is a still relatively large 54k.
    I have a similar non-indexed-colour PNG logo on my Mac that I saved in CS3 on a PC. It is actually larger, at 260x148, yet it's only 17k, and that logo is more complex, with a gloss effect over part of it. What's going on and how do I fix it? It's making Photoshop useless, especially when saving for the web.
    Thanks

    Thanks, I had considered that, but all my old files report the correct file sizes. I have been experimenting, and Fireworks saves the file at 2.6k as PNG-24 and 5.1k as JPEG. But I don't really want to have to save the files twice, once cut from the comp in Photoshop and again in Fireworks; juggling the two applications is a bit inconvenient even with just one file, and especially so when you have potentially hundreds of files to save.
    I've also turned off the icon and Windows thumbnail options in Photoshop's preferences, and although this has decreased the file sizes, they are still quite large at 27k; Save for Web does better, at 4k for the PNG and 16k for the JPEG. Is there any way to get Fireworks' file-saving performance in Photoshop? It seems strange that the compression in Photoshop would be second-rate compared to Fireworks, given that they are both developed by Adobe and Photoshop is Adobe's primary image-editing software.

  • 4.2.3/.4 Data load wizard - slow when loading large files

    Hi,
    I am using the data load wizard to load CSV files into an existing table. It works fine with small files of up to a few thousand rows; when loading 20k rows or more, the loading process becomes very slow. The table has a single numeric column as its primary key.
    The primary key is declared at "Shared Components" -> Logic -> "Data Load Tables" and is recognized as "pk(number)" with "case sensitive" set to "No".
    While loading data, this configuration leads to the execution of the following query for each row:
    select 1 from "KLAUS"."PD_IF_CSV_ROW" where upper("PK") = upper(:uk_1)
    which can be found in the v$sql view while loading.
    This makes the loading process slow, because the UPPER function prevents any index from being used.
    It seems that the "case sensitive" setting is not evaluated.
    Dropping the numeric index on the primary key and creating a function-based index instead does not help either.
    The explain plan shows an implicit TO_CHAR conversion:
    UPPER(TO_CHAR(PK)) = UPPER(:UK_1)
    The TO_CHAR does not appear in the query text itself, but it is presumably what a function-based index would have to match.
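    For anyone trying this: a function-based index is only used when it is built on exactly the expression the optimizer evaluates, so it would have to include that implicit conversion. A sketch (the index name is made up; table and column are taken from the query above):

         -- Index the exact predicate from the explain plan,
         -- including the implicit TO_CHAR on the numeric PK column:
         CREATE INDEX pd_if_csv_row_upk_ix
             ON "KLAUS"."PD_IF_CSV_ROW" (UPPER(TO_CHAR("PK")));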
    Please provide a solution or workaround for the data load wizard to work with large files in an acceptable amount of time.
    Best regards
    Klaus

    Nevertheless, a bulk loading process is what I would really like to have as part of the wizard.
    If all of the CSV files have the same structure:
    use the Excel2Collection plugin (Process Type Plugin - EXCEL2COLLECTIONS)
    create a VIEW on the collection (makes it easier elsewhere)
    create a procedure (in a package) to bulk process it.
    The most important thing is to have, somewhere in the package (i.e. in your code that is not part of APEX), information that clearly states which columns in the collection map to which columns in the table, the view, and the variables (APEX_APPLICATION.g_fxx()) used for tabular forms.
    MK

  • Spot Healing Brush Content-Aware SLOW! on large files

    I updated from Photoshop CC to CC 2014. When working with ultra-large-format scans (more than 3 gigapixels), the Spot Healing Brush set to Content-Aware takes unusually long (5-7 seconds to retouch a small dust spot). In Photoshop CC, the same Content-Aware Spot Healing Brush actions were executed instantly, without any perceived delay, once I had set a source for the Clone Stamp tool (which I found weird, but doing so really did make the Spot Healing Brush faster on large files). On smaller images it's still very fast in CC 2014, but not on the 3.5-gigapixel images I'm working on at the moment.
    I know there have been some changes/improvements to the Content-Aware tools, but in this case they are seriously hurting performance. I haven't noticed any other slow-downs in processing performance with the update.
    I'm working on a dual-CPU (2 x 2.4 GHz 6-core) and 128 GB RAM Windows 8.1 machine. I've allowed Photoshop to use up to 120 GB RAM.
    Is there any way to reset the Content-Aware Spot Healing Brush to work as fast as it had before the CC 2014 update?

    My performance settings are the same for CC and CC 2014.
    I assume the problem is introduced by the changes/improvements to the Content-Aware tools; on normal file sizes it may not matter to the user whether the operation takes 25 or 200 ms, but on large files it seriously slows down retouching work.
    So far I've been doing as you suggested and using CC for that type of work. But as it takes about 90 minutes to save these huge files, and AFAIK Photoshop CC and CC 2014 cannot be run as separate instances simultaneously, it's quite inconvenient not being able to use CC 2014 for other work while CC is saving a file.
    If the new Content-Aware functions simply take that much more processing, it would be great to have a choice (similar to the choice of RAW engine used in ACR). However, if the new Content-Aware functions are just not yet optimized for performance, then it's a bug.
    I will file a Feature Request/Bug Report and ask if there is anything that can be done about it in Photoshop CC 2014.
    Thanks for your help, Eugene.

  • Slow performance or very large XDP file size

    There have been a few reports of people experiencing slow performance in their forms (typically dynamic forms) or XDP files with very large sizes.
    These are the symptoms of a problem with cut and paste in Designer, where a processing instruction (PI) used to control how Designer displays a specific palette is repeated many, many times. If you look in your XDP source and see this line repeated more than once, you have the issue:
    <?templateDesigner StyleID aped3?>
    Until now, the workaround has been to apply a style sheet to the XDP that removes the duplicated instruction. A patch has been released that fixes the cut-and-paste issue and also repairs affected templates when you open them in a Designer with the patch applied.
    Here is a blog entry that describes the patch as well as where to get it:
    http://blogs.adobe.com/livecycle/2009/03/post.html

    My XDP file had grown to 145 MB before I decided to see what was actually happening.
    It appears that the LiveCycle Designer ES program sometimes writes a lot of redundant data: the same line, millions of times, over and over again.
    I wrote this small Java program, which reduced the size to 111 KB! (Wow, what a bug that must have been!)
    Here's the source code:
    import java.io.BufferedReader;
    import java.io.BufferedWriter;
    import java.io.FileNotFoundException;
    import java.io.FileReader;
    import java.io.FileWriter;
    import java.io.IOException;

    public class MakeSmaller {
        // The processing instruction Designer duplicates endlessly.
        // The leading whitespace matters: the match is exact, per line.
        private static final String DELETE_STRING = "                           <?templateDesigner StyleID aped3?>";

        public static void main(String... args) {
            BufferedReader br = null;
            BufferedWriter bw = null;
            try {
                br = new BufferedReader(new FileReader(args[0]));
                // Output goes to a new ".small" file; the original is never touched.
                bw = new BufferedWriter(new FileWriter(args[0] + ".small"));
                String line = null;
                boolean firstOccurrence = true;
                while ((line = br.readLine()) != null) {
                    if (line.equals(DELETE_STRING)) {
                        // Keep only the first occurrence of each run of duplicates.
                        if (firstOccurrence) {
                            bw.write(line + "\n");
                            firstOccurrence = false;
                        }
                    } else {
                        bw.write(line + "\n");
                        firstOccurrence = true; // any other line resets the filter
                    }
                }
            } catch (FileNotFoundException e) {
                e.printStackTrace();
            } catch (IOException e) {
                e.printStackTrace();
            } finally {
                if (br != null) {
                    try {
                        br.close();
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }
                if (bw != null) {
                    try {
                        bw.close();
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }
            }
        }
    }
    The file that gets generated sits in the same location as the .xdp file but gets the extension .small. Just in case something goes wrong, the original file is NOT modified, as you can see in the source code. And yes, Designer REALLY wrote that line a gazillion times in the .xdp file (shame on the programmers!).
    You can also see that I still write the first occurrence of the line to the .small file, just in case it's needed...
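    To try it (assuming the class is saved as MakeSmaller.java; the form name is just an example):

         javac MakeSmaller.java
         java MakeSmaller MyForm.xdp    # writes the cleaned copy to MyForm.xdp.small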

  • Slow searches in large file ...

    The Find function is almost unusable in my large file, because the program starts looking for the first string of text I type before I can finish typing in the search field. I have enabled the whole-words option, but this doesn't help much. I tried the "use selection for find" option in the menu, but it is always disabled, so I'm not sure if it's of any use...
    Any ideas how to get search to wait until I've typed the whole search string before it starts to bog down?
    Thanks
    jcm

    Thanks for the reply.
    The file is 7 MB: 1000 rows, with columns out to BB.
    I don't mind that the search is slow, but when I want to search for "johnson", for example, it stops me from typing after "jo", then again after "john", then again after "johns", etc. The best workaround so far is to type "johnson" into a blank cell somewhere, then paste it into the search field... but if that's the answer, it's back to Excel.

  • CFB slowing up on large files

    G'day
    This is just an observation more than anything else.
    The performance of the release version of CFB has been much better than the beta versions, which is heartening. During the beta I had to have as much background functionality as possible switched off, e.g. workspace rebuilding, syntax checking, etc., to even get work done, but in the release version I've been able to leave everything on, and it's been OK thus far.
    I'm in the process of writing a CFC which has grown rather large (I might need to do some refactoring, but I'll get to that a bit further down the track...). CFB seemed to handle it fine up until about 1400-1500 lines, but I've now tipped 2000 lines (92 kB) and CFB is beginning to struggle. Syntax highlighting is so slow now that it's always wrong for a while until CFB catches up, and even data entry seems to have slowed - I guess due to syntax checking or something - to the point that I am typing faster than CFB can render the characters. I'm a reasonably fast typist, but when I'm coding I'm not spectacularly fast; sometimes I need to pause to think about what I'm doing ;-)
    Unfortunately I do not have CFEclipse installed on this machine to compare performance.
    Now... I guess I expect the thing to slow down eventually as files grow, but as it currently stands I'm expecting to get frustrated with waiting for CFB to catch up with me if it slows down much more. I speculate this file will grow by another 500-odd lines yet.
    Adam

  • Is it best to upload HD movies from a camcorder to iMovie or iPhoto? iMovie gives the option for very large file sizes - presumably it is best to use this file size because the HD movie is then at its best quality?

    Is it best to upload an HD movie to iPhoto or iMovie? iMovie seems to store the movie in much larger files - presumably this is preferable, as a larger file means better quality?

    Generally it is, if you're comparing identical compressors and resolutions, but there's something else happening here. If you're worried about quality degrading, check the original file's details on the camera (or card), either with Get Info or by opening it in QuickTime and showing the info. You should find the iPhoto version (reveal it in the Finder) is a straight copy. You can't really increase the image quality of a movie (barring a few tricks) by increasing file size, but Apple's editing products create a more "scrub-able" intermediate file, which is quite large.
    Good luck and happy editing.

  • Large file size and slow pagination when form filled out in Reader 7

    Hi all,
    I have designed a form with LiveCycle Designer 8. For compatibility, I have made it an Acrobat 7 (Dynamic) XML form. The form itself is 18 pages long, and when it is filled in with Reader 8, pagination is quick and the file size only grows by about 100 KB. However, when I fill in the form with Reader 7, it takes about 10-20 seconds to repaginate the form each time a field is filled in, and the filled-in form size jumps to about 6 MB!
    Can anyone explain this behavior and tell me if there is a way that I can work around this for my clients using Reader 7?
    Thanks.

    Hi,
    Performance of forms was a known problem in Reader / Acrobat 7. There are a few things you can do to help, but it can't really be fixed other than by upgrading to version 8.
    This document contains a few ideas to help with form performance:
    http://partners.adobe.com/public/developer/en/pdf/lc_designer_perf_guidelines.pdf
    There also used to be an Adobe document for version 7 which stated that 8 pages was the practical maximum form size. You could go a bit bigger than that, but performance would degrade pretty quickly.
    As Jared stated before, for large forms, moving to version 8 is about your only realistic option.
    Regards
    Anthony Jereley
    Indigo Pacific
    www.indigopacific.com
