Large File Encryption is Too Slow!! Help

I am trying to write some methods to handle encryption of zip files, most of which are very large (over 200 MB). My first attempt loaded the whole file into memory, created a Base64-encoded string, and encrypted that back out to a file, but there isn't nearly enough memory for that, and it generally wouldn't scale as files get larger. So instead I am opting for something like CipherOutputStream: read in, write out, without a buffer. On small files (under 1 MB) this is great, as it happens instantly. But when files reach the 200 MB minimum size of our zips, it takes hours. Is there any way I can speed this up? (See code below.)
import java.util.*;
import java.io.*;
import java.util.zip.*;
import javax.crypto.*;
import javax.crypto.spec.*;

public class EncryptedZipUtil {

    private static final String ALGO = "DESede";
    private static final String CIPHER_ALGO = "DESede/CBC/PKCS5Padding";
    private static final byte[] IV_PARAMS = new byte[] {
            127, 111, 13, 120, 12, 34, 56, 78
    };
    private static final String ENCODING = "UTF-8";
    private static final byte[] ENCRYPTION_KEY =
            "Encryption key must be at least 30 characters long".getBytes();

    private static Cipher getCipher(int mode) throws Throwable {
        DESedeKeySpec spec = new DESedeKeySpec(ENCRYPTION_KEY);
        SecretKeyFactory keyFactory = SecretKeyFactory.getInstance(ALGO);
        SecretKey theKey = keyFactory.generateSecret(spec);
        Cipher dcipher = Cipher.getInstance(CIPHER_ALGO);
        IvParameterSpec ivParameters = new IvParameterSpec(IV_PARAMS);
        dcipher.init(mode, theKey, ivParameters);
        return dcipher;
    }

    public static ZipInputStream getDecryptedZipStream(File f) throws Throwable {
        FileInputStream fin = new FileInputStream(f);
        return new ZipInputStream(new CipherInputStream(fin, getCipher(Cipher.DECRYPT_MODE)));
    }

    public static void streamEncryptZipFile(File zipFile, File outzip) throws Throwable {
        FileInputStream fin = new FileInputStream(zipFile);
        CipherOutputStream cos = new CipherOutputStream(
                new FileOutputStream(outzip), getCipher(Cipher.ENCRYPT_MODE));
        // Copies one byte per call -- no buffering anywhere in the chain.
        while (fin.available() != 0)
            cos.write(fin.read());
        fin.close();
        cos.close();
    }

    public static void streamDecryptZipFile(File e_zipFile, File outzip) throws Throwable {
        CipherInputStream cin = new CipherInputStream(
                new FileInputStream(e_zipFile), getCipher(Cipher.DECRYPT_MODE));
        FileOutputStream fos = new FileOutputStream(outzip);
        // Same single-byte pattern on the decrypt side.
        int the_byte;
        while ((the_byte = cin.read()) != -1)
            fos.write(the_byte);
        cin.close();
        fos.close();
    }

    public static void main(String[] args) throws Throwable {
        EncryptedZipUtil.streamEncryptZipFile(
                new File("D:\\ziptest\\original.zip"), new File("D:\\ziptest\\encrypted.zip"));
        EncryptedZipUtil.streamDecryptZipFile(
                new File("D:\\ziptest\\encrypted.zip"), new File("D:\\ziptest\\decrypted.zip"));
    }
}

Like I said, the code here works great with small files: quick, correct, and simple. My machine is a 2.8 GHz P4 with 1024 MB RAM, and when I run this on my larger files it uses 99% CPU and no memory beyond the standard JVM heap.
Any advice would be greatly appreciated

Nice answer, but for additional duke points I've thought of an issue with it: although this definitely speeds up encrypt/decrypt time, there is no longer any way for me to access the file as a stream directly. Look at the method getDecryptedZipStream(File f): since the file is generally a zip, getting a stream in this manner won't work if we want to thumb through the contents and extract entries on the fly, because the encryption is now done in chunks rather than byte by byte, which is how the underlying stream would be read if we wanted to browse the zip. So this solution doesn't solve all aspects of the issue. My alternative was to encrypt the individual contents of the zip prior to zipping and leave the zip itself intact. I could then browse the zip, but could not access any of the data without decrypting each file first. I don't know yet whether that is the best solution.
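The answer this follow-up refers to isn't quoted above, but the "chunks rather than bytes" remark suggests it replaced the single-byte loop with block-sized reads and writes. A minimal sketch of that buffered approach, written as a drop-in replacement for streamEncryptZipFile above (the 8 KB buffer size is my assumption, not from the thread):

public static void streamEncryptZipFile(File zipFile, File outzip) throws Throwable {
    FileInputStream fin = new FileInputStream(zipFile);
    CipherOutputStream cos = new CipherOutputStream(
            new FileOutputStream(outzip), getCipher(Cipher.ENCRYPT_MODE));
    byte[] buf = new byte[8192];   // assumed size: each read moves up to 8 KB instead of 1 byte
    int len;
    while ((len = fin.read(buf)) != -1)
        cos.write(buf, 0, len);    // the cipher also processes a whole block per call
    fin.close();
    cos.close();                   // close() flushes the final padded cipher block
}

The decrypt loop can be rewritten the same way, reading from the CipherInputStream into a byte[] and writing slices to the FileOutputStream.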

Similar Messages

  • StringBuffer delete (int start, int end) is too slow - HELP!!

    I've traced a performance problem to one line of code: a StringBuffer "delete (int start, int end)" call.
    My StringBuffer contains a lot of data, maybe 500,000 characters. My code loops, searching for a particular character (the search is fast, btw). When that particular character is found, I want to delete that character plus the characters following it. My code looks something like:
int pos;
while ((pos = myStringBuffer.toString().indexOf('~')) != -1)
     myStringBuffer.delete(pos, pos + 3);

I'm 100% sure the bottleneck is the delete call. If I change the code to the following, where I've commented out the delete, it loops the same number of times and runs lightning-fast:
int pos = 0;
while ((pos = myStringBuffer.toString().indexOf('~', pos)) != -1)
     pos++;

The loop, and therefore the delete method, will be executed thousands of times. To give you some idea of the poor performance I'm seeing, the first code (with the delete) takes more than 10 minutes to run, while the second code (without the delete) takes less than one-tenth of one second.
    Any suggestions? Thank you.

    Thanks for all the help! da.futt's solution works perfectly: 65,000 times through the loop in about one-tenth of one second. Problem solved.
    We're still on Java 1.3.1 (large company = slow to upgrade), so I can't even remove toString(). I wonder how many "small" performance problems we have (and are unaware of) that could be addressed by upgrading?
    My first posting here and a positive result. I'm not a Java expert, but neither am I a rookie. I'll be back to try and help others as I've been helped here.
    Thanks again.
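da.futt's code isn't quoted in the thread, but the timings fit a single-pass rewrite: each delete(pos, pos + 3) shifts the entire tail of a ~500,000-character buffer, so thousands of deletes degenerate into O(n²) work, while one pass that copies only the characters to keep is O(n). A sketch under that assumption (Java 1.3-compatible, since the poster mentions being stuck on 1.3.1; the method name is my own):

static StringBuffer stripMarkers(StringBuffer sb) {
    StringBuffer result = new StringBuffer(sb.length());
    int i = 0;
    while (i < sb.length()) {
        if (sb.charAt(i) == '~') {
            i += 3;                  // drop '~' and the two characters that follow it
        } else {
            result.append(sb.charAt(i));
            i++;
        }
    }
    return result;                   // one copy replaces thousands of delete() calls
}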

  • Safari too slow - help finding a bug?

    Hi all,
Sorry for such a noob question, this is a problem I never have myself. My wife's iMac (10.4) is running Safari extremely slowly. This started while we were out of town and our in-laws were house-sitting.
My in-laws have been known to (inadvertently) change settings and download things they should not (Flash players or radio listening programs, etc.). I'm betting something similar happened, but I couldn't find anything in the surfing history since there was too much to go through. Other ideas?
    Ran Onyx and all that is clean, btw. I've never needed any other anti-malware software myself...
    (As a sidenote, would upgrading to 10.5 make a difference?) I have an unopened copy of Leopard - Server (I don't need the server part) at my disposal. What needs to happen for that?
    Thanks so much,
    -josh

    HI,
    Try uninstalling then reinstalling a new copy of the Adobe Flash Player plugin.
    How to Uninstall Flash plug in
    How to reinstall Flash plug in
    Try Safari maintenance also.
    
    From the Safari Menu Bar, click Safari / Empty Cache.
    When you are done with that, from the Safari Menu Bar, click Safari / Reset Safari. Select the top 5 buttons and click Reset.
    Mac OS: Web Browser Quits Unexpectedly or Stops Responding
    Safari add-ons can cause performance issues or other situations

    Repair disk permissions:
    Quit any open applications/programs. Launch Disk Utility (Applications/Utilities). Select MacintoshHD in the panel on the left and select the First Aid tab. Click: Repair Disk Permissions. If you see a long list of "messages" in the permissions window, it's ok and can be ignored. As long as you see "Permissions Repair Complete" when it's finished, you're done. Quit Disk Utility and restart your Mac.
    *As for Leopard... minimum requirements are as follows:*
    an Intel processor or a PowerPC G4 or G5 processor
    a DVD drive
    built-in FireWire
    at least 256 MB of RAM for a PowerPC based Mac, and 512 MB for an Intel-based Mac
    a built-in display or a display connected to an Apple-supplied video card supported by your computer
    at least 6 GB of disk space available, or 8 GB if you install the developer tools
    Upgrading from one Mac OS X version to another does not necessarily "fix" bugs. It's better to troubleshoot and make sure the drive is ok before upgrading. You can boot from the install disk that came with the iMac to do that.
    Insert Installer disk and Restart, holding down the "C" key until grey Apple appears.
    Go to Installer menu and launch Disk Utility.
    Select your HDD (manufacturer ID) in the left panel.
    Select First Aid in the Main panel.
    *(Check S.M.A.R.T Status of HDD at the bottom of right panel. It should say: Verified)*
    Click Repair Disk on the bottom right.
    If DU reports disk does not need repairs quit DU and restart.
    If DU reports errors Repair again and again until DU reports disk is repaired.
    When you are finished with DU, from the Menu Bar, select Utilities/Startup Manager.
    Select your start up disk and click Restart
    While you have the Disk Utility window open, look at the bottom of the window. Where you see Capacity and Available. *Make sure there is always 10% to 15% free disk space*
    As for Leopard Server:
    Mac OS X Server is a UNIX server operating system from Apple. The platform is based on the same architecture as Mac OS X, but includes additional services, applications, and administration tools for managing users and services and for deploying servers and clients. The server operating system is included on Xserve, a rack mount server designed by Apple. It is also available preinstalled on the Mac mini and Mac Pro and is sold separately for use on any Macintosh computer meeting its minimum requirements. *Mac OS X Server is commonly found in small business, education, and large enterprise organizations.*
    Carolyn

  • IR report found 1 million records with BLOB files, performance is too slow!

    we are using
    oracle apex 4.2.x
    Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - Production
    mod_plsql with Apache
    Hardware: HP proliant ML350P
    OS: WINDOWS 2008 R2
    We have a customized content management system developed in APEX. When the IR report is opened it finds 1 million rows, and each row has a BLOB (<5 MB, as pdf/tiff/bmp/jpg); the row count will keep rising in the future, but the search performance is very slow!
    How can I increase the performance?
    How can I show a progress status to the user while the search is running on the IR report itself?
    Thanx,
    Ram

    It's impossible to make definitive recommendations on performance improvement based on the limited information provided (in particular the absence of APEX debug traces and SQL execution plans), and lacking knowledge of the application  requirements and access to real data.
    As noted above, this is mainly a matter of data model and application design rather than a problem with APEX.
    Based on what has been made available on apex.oracle.com, taking action on the following points may improve performance.
    I have concerns about the data model. The multiple DMS_TOPMGT_MASTER.NWM_DOC_LVL_0x_COD_NUM columns are indications of incomplete normalization, and the use of the DMS_TOPMGT_DETAILS table hints at an EAV model. Look at normalizing the model so that the WM_DOC_LVL_0x_COD_NUM relationship data can be retrieved using a single join rather than multiple scalar subqueries. Store 1:1 document attributes as column values in DMS_TOPMGT_MASTER rather than rows in DMS_TOPMGT_DETAILS.
    There are no statistics on any of the application tables. Make sure statistics are gathered and kept up to date to enable the optimizer to determine correct execution plans.
    There are no indexes on any of the FK columns or search columns. Create indexes on FK columns to improve join performance, and on searched columns to improve search performance.
    More than 50% of the columns in the report query are hidden and not apparently used anywhere in the report. Why is this? A number of these columns are retrieved using scalar subqueries, which will adversely impact performance in a query processing 1 million+ rows. Remove any unnecessary columns from the report query.
    A number of functions are applied to columns in the report query. These will incur processing time for the functions themselves and context switching overhead in the case of the non-kernel dbms_lob.get_length calls. Remove these function calls from the query and replace them with alternative processing that will not impact query performance, particularly the use of APEX column attributes that will only apply transformations to values that are actually displayed, rather than to all rows processed in the query.
    Remove to_char calls from date columns and format them using date format masks in column attributes.
    Remove decode/case switches. Replace this logic using Display as Text (based on LOV, escape special characters) display types based on appropriate LOVs.
    Remove the dbms_lob.get_length calls. Instead add a file length column to the table, compute the file size when files are added/modified using your application or a trigger, and use this as the BLOB column in the query.
    Searching using the Search Field text box in the APEX interactive report Search Bar generates a query like:
    select *
    from
      (select *
      from
        (...your report query...)
      ) r
      where ((instr(upper("NWM_DOC_REF_NO"), upper(:APXWS_SEARCH_STRING_1)) > 0
      or instr(upper("NWM_DOC_DESC"), upper(:APXWS_SEARCH_STRING_1)) > 0
      or instr(upper("SECTION_NAME"), upper(:APXWS_SEARCH_STRING_1)) > 0
      or instr(upper("CODE_TYPE"), upper(:APXWS_SEARCH_STRING_1)) > 0
      or instr(upper("REF_NUMBER_INDEX"), upper(:APXWS_SEARCH_STRING_1)) > 0
      or instr(upper("DATE_INDEX"), upper(:APXWS_SEARCH_STRING_1)) > 0
      or instr(upper("SUBJECT_INDEX"), upper(:APXWS_SEARCH_STRING_1)) > 0
      or instr(upper("NWM_DOC_SERIEL"), upper(:APXWS_SEARCH_STRING_1)) > 0
      or instr(upper("NWM_DOC_DESCRIPTION"), upper(:APXWS_SEARCH_STRING_1)) > 0
      or instr(upper("NWM_DOC_STATUS"), upper(:APXWS_SEARCH_STRING_1)) > 0
      or instr(upper("MIME_TYPE"), upper(:APXWS_SEARCH_STRING_1)) > 0
      or instr(upper("NWM_DOC_FILE_BINARY"), upper(:APXWS_SEARCH_STRING_1)) > 0))
      ) r
    where
      rownum <= to_number(:APXWS_MAX_ROW_CNT)
    This will clearly never make use of any available indexes on your table. If you only want users to be able to search using values from 3 columns then remove the Search Field from the Search Bar and only allow users to create explicit filters on those columns. It may then be possible for the optimizer to push the resulting simple predicates down into the inlined report query to make use of indexes on the searched column.
    I have created a copy of your search page on page 33 of your app and created an After Regions page process that will create Debug entries containing the complete IR query and bind variables used so they can be extracted for easier performance analysis and tuning outside of APEX. You can copy this to your local app and modify the page and region ID parameters as required.

  • USB of my iMac G3 too slow, HELP!!

    I received an old iMac G3 from my brother. Now I'm using it as a "media center" for my Apple TVs and MacBook. It was connected to a 500 GB HD via FireWire, but it got full, so I replaced it with a 1 TB HD (Westbook), but via USB. The iMac reads it, but it takes like 5 minutes to recognize it; it works well streaming to my Apple TV, but it doesn't work via iTunes to my MacBook. Another note: I can't connect any new iPod to the USB, it says that it needs a faster blablabla...
    Can I change my USB, or am I forced to re-back up my library and buy another HD?
    Thanks for any help.

    You need to stick with using Firewire drives. Even on modern iMacs using Firewire is preferable as it out-performs USB2.0 in data transfer rates. 
    mrtotes

  • File.length() too slow?

    I'm working on an application that needs to process directory listings on a network fileserver, looking for files and directories, and getting basic info about the files found (size, last modified date).
    I'm trying to use java.io.File.listFiles(java.io.FileFilter) with a java.io.FileFilter returning file.isDirectory() or file.isFile() to get a list of just files or directories, and then get the rest of the file information later for each of the returned Files. However, when it gets to a directory with a lot of files (13000+), it is unacceptably slow. I have tracked it down to File.length(), taking up to 80 ms per file (!), which amounts to only about 13 files per second.
    It's not a problem with the platform (Win XP): a directory listing containing all the information I need takes less than 3 seconds for this big directory, while getting the same through the Java APIs (calling isDirectory(), isFile(), length() and lastModified() within the FileFilter callback) takes ages.
    Is there a better way to get a directory listing, without being orders of magnitude slower than necessary? I can think of calling the native dir command and parsing the output, but that is a mess...

    I have tracked this down to the native implementation of File.length(): the VC++ runtime function _stati64 which they use to get the file length is too slow.
    Why don't they use the Windows API? I have tested that getting the file size using GetFileAttributesEx() is at least 50x faster than _stati64 for my file!
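    This thread predates java.nio.file (Java 7), where a single Files.readAttributes call returns size, type, and last-modified time together, avoiding one native call per attribute. A sketch of that approach, for anyone hitting the same wall on a modern JVM (the directory path is a made-up example):

    import java.io.IOException;
    import java.nio.file.*;
    import java.nio.file.attribute.BasicFileAttributes;

    public class FastListing {
        public static void main(String[] args) throws IOException {
            Path dir = Paths.get("D:\\bigdir");    // hypothetical directory
            try (DirectoryStream<Path> entries = Files.newDirectoryStream(dir)) {
                for (Path p : entries) {
                    // One attributes read per entry instead of isFile() + length() + lastModified()
                    BasicFileAttributes a = Files.readAttributes(p, BasicFileAttributes.class);
                    if (a.isRegularFile())
                        System.out.println(p.getFileName() + "\t" + a.size() + "\t" + a.lastModifiedTime());
                }
            }
        }
    }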

  • File sharing too slow

    Hi,
    I have 2 Macs connected via a hub. My internet connection is very fast and everything is ok, but file sharing is too slow: 1 GB takes 8 hours to be transferred.
    Any fix?
    Thanks!

    Yup, same here: a 9 GB file shared via AirPort Extreme from my Mac Pro (8 proc) to my 17" MacBook Pro was going to take 4 hrs?????? Doing it via FireWire now: 10 minutes.
    Is there a network setting I'm missing, as both machines obviously have 802.11n and the Extreme is set up for N (B/G compatible)??
    jas

  • Very large file, 1.6 GB: sports program for school, lots of photos, template where media spots were saved and photos dropped in... trying to get it to the printer, who says it's too large, but when he reduces the size it loses all quality... HELP??

    Please help... large file, my son's athletic program. Had no problem with quality last year... this new printer said the file is too large... too many layers, can I FLATTEN?... He reduced the size and the photos look awful. It is 81 pages. Have to have it complete before next Friday!! Help, anyone??

    At that size it sounds like you have inserted lots of photos at too large a size and resolution.
    Especially for a year book style project, first determine your image sizes and crop and fix the resolution to 300 dpi, then drag them into the publication. Pages does not use external references to files and rapidly grows to quite large size unless you trim and size your images before placement.
    Peter

  • Please Help! All Sounds in Classic are too slow now!

    Something dreadful happened (wish I knew what) and now every sound that is played back by any Classic application is about 70% too slow.
    No matter if it is a Classic game like Unreal (I), Deus Ex, AvP for OS9 (the OSX version works fine), or Quicktime 6, or any other given Classic application. The only sounds that appear normal are the Platinum system sounds.
    I don't know which files affect Classic sound playback or where they are located. Can't imagine what could cause that particular problem.
    Hope someone could help...
    Many thanks in advance
    M:A:
    PowerMac G5 2x2 GHz Mac OS X (10.4.6) 1.5 GB RAM, standard sound card (TI-TAS3004)
    PowerMac G5 2x2

    Sorry, didn't get an email notification, so I just checked the thread.
    Starting and quitting Garage Band actually helped!
    (Imagine listening to the meaningful dialogues in "Deus Ex" spoken with Sesame Street's Cookie Monster voice. Aaaaannnd reeealllly sllloowwww. Doesn't quite capture the mood...)
    Do you have an explanation why Garage Band should affect Classic sound output?
    Thanks a lot,
    Martin

  • HT1338 My Mac is becoming too slow. It takes a long time to open Word documents, PDF files, Excel documents, or even Safari. Can anybody suggest something? I have tried reducing the number of open applications, but it does not seem to help.

    Hi ...
    Checked to see how much free space there is on the startup disk lately?
    Right or control click the MacintoshHD icon. Click Get Info. In the Get Info window you will see Capacity and Available. Make sure there's a minimum of 15% free disk space.
    Freeing Up Hard Disk Space - Mac Guides
    If disk space is not the issue, booting in Safe Mode deletes system caches, which may help.
    A Safe Mode boot takes longer than a normal boot, so be patient.
    Once you see the Desktop, click the Apple menu icon top left corner of the screen.
    From the drop down menu click Restart.
    See if that makes a difference ...

  • 4.2.3/.4 Data load wizard - slow when loading large files

    Hi,
    I am using the data load wizard to load csv files into an existing table. It works fine with small files up to a few thousand rows. When loading 20k rows or more the loading process becomes very slow. The table has a single numeric column for primary key.
    The primary key is declared at "shared components" -> logic -> "data load tables" and is recognized as "pk(number)" with "case sensitive" set to "No".
    While loading data, this configuration leads to the execution of the following query for each row:
    select 1 from "KLAUS"."PD_IF_CSV_ROW" where upper("PK") = upper(:uk_1)
    which can be found in the v$sql view while loading.
    It makes the loading process slow, because of the upper function no index can be used.
    It seems that the setting of "case sensitive" is not evaluated.
    Dropping the numeric index for the primary key and using a function based index does not help.
    Explain plan shows an implicit "to_char" conversion:
    UPPER(TO_CHAR(PK)) = UPPER(:UK_1)
    This is missing in the query but maybe it is necessary for the function based index to work.
    Please provide a solution or workaround for the data load wizard to work with large files in an acceptable amount of time.
    Best regards
    Klaus

    Nevertheless, a bulk loading process is what I really like to have as part of the wizard.
    If all of the CSV files are identical:
    use the Excel2Collection plugin ( - Process Type Plugin - EXCEL2COLLECTIONS )
    create a VIEW on the collection (makes it easier elsewhere)
    create a procedure (in a Package) to bulk process it.
    The most important thing is to have, somewhere in the Package (ie your code that is not part of APEX), information that clearly states which columns in the Collection map to which columns in the table, view, and the variables (APEX_APPLICATION.g_fxx()) used for Tabular Forms.
    MK

  • Disk too slow / System overload HELP?

    I'm really hoping someone can help me out with this!:
    I purchased a G4 PowerBook around six months ago, with Logic Pro pre-installed. Since then, I have mixed and mastered lots of music that was recorded on a previous system and imported. I was running multiple tracks simultaneously with tons of plug-ins and everything ran smoothly on my new Mac. HOWEVER, within the last month, my playback capabilities have been severely reduced. When I try to play back even two tracks simultaneously (with just one compressor or reverb), I get the error message, "Disk too slow / System overload."
    My first reaction to this problem was to check my disk space and see how much room I had left: it turns out I have over one third of my hard drive space still available. I checked my memory capabilities and they seem to be more than adequate. I bought new virus protection, too, and ran scans to see if that might make a difference, but it hasn't. I don't understand why this is happening on a system so new. ANY ADVICE YOU CAN OFFER WOULD BE GREATLY APPRECIATED!
    Many thanks.
    Powerbook G4   Mac OS X (10.4.6)  

    It is generally not a good idea to run Logic on the same drive that contains your audio files and samples. Most of the data involved with Logic, whether it be programs or individual audio files, will be saved physically all over the place on your hard drive. At a certain point the actual part of the drive that reads the data will not be able to move around to all the little different audio files it needs to access for the session, on top of running the program and whatever other software is up.
    It is best to save all your audio files and samples to a fast external drive (7200 rpm at least). otherworldcomputing.com has great drives that have high quality cases and the option to put any drive they have in stock into the case (I would recommend Seagate).
    If you are going to continue running your entire sessions off the internal drive, then you need to keep up with your basic Mac housekeeping. You should repair your permissions twice a month (use Disk Utility), and you should buy a good directory repair program such as DiskWarrior. The directory is exactly what it sounds like: it keeps track of where everything is on the computer. If your directory is corrupt, the computer has to do an extra manual search to find what it's looking for; if your directory is healthy, your computer makes a straight shot to the file you are looking for.

  • "This file attachment is too large to be viewed." What?

    I created a 15.6MB m4v for iPhone using compressor to test being able to email clips to a client for review on his iphone. A couple of strange things. First, when the message is received, the attachment now reads that it is 20.6MB in size. Second, when I attempt to open it on my iPhone... it gives me the message "This file attachment is too large to be viewed."
    Is there a limitation with the iPhone here that I didn't know about?

    Yes this is a public board, which shouldn't exclude common courtesy.
    This is a user to user help forum only and fellow users that post answers to questions or provide suggestions here volunteer their time to do so.
    I don't have a problem with you providing what you think is a shortcoming, but there is nothing wrong with me providing a reason or reasons why it isn't a shortcoming for the most part. If that is being "needlessly defensive", then you are needlessly interested only in thoughts that agree with your own.
    What does that mean exactly? I wasn't suggesting they would, they usually use their own dedicated mail servers... which they can set to unlimited if they wish... which is why putting the limitation on the iphone pulls the rug out from under them if they want to use the iPhone AND send/receive large files via email. Pretty simple equation really.
    What it means exactly is, you were talking about business use and you mentioned Gmail having a 20MB overall message size limit. Yes many businesses use their own dedicated server, which they can set to unlimited if they wish, but this will only work internally and with other businesses that do the same.
    I'll bet you dollars to donuts the overwhelming majority of businesses with or without their own dedicated email server do not have an unlimited overall message size limit. Some may have a larger overall message size limit than 10MB, but 10MB is a very common limit.
    So any way in which the iPhone can become more useful as a business device is a good thing... isn't it?
    Certainly, but what you consider as being more useful may or may not match what I or someone else may consider as being more useful, and there is nothing wrong with that.
    Perhaps you will never need or otherwise enjoy landscape email, full screen attached video playback, or attachments greater than 10MB... but to not see the obvious improvement as a business device had these features been implemented in 2.0.2 just doesn't make any sense to me.
    I have been discussing overall message size limits only. I haven't mentioned anything about these other items.

  • Large Photoshop file has caused system slow down.

    I have a Dual 2.0 GHz G5 which has worked very well since I got it last spring. Just recently, however, I've had some slowdowns. There are a number of things I've noticed, such as a more frequent spinning beachball and font redraw issues.
    These issues started when I was working on a massive Photoshop file. I was doing work on a trade show booth which eventually resulted in a 10 GB Photoshop file! I realize I probably should have taken steps to reduce the size of the file, but that's neither here nor there. Despite the size of the file I was still quite impressed with the performance of my machine. That said, I've found the general performance of the machine since working on this file to be diminished.
    I've done all of the typical maintenace items such as repairing disk and permissions via Disk Utility, clearing caches and running Maintenance scripts using Cache Out X, MacJanitor and Tiger Cache Cleaner, but still the performance of my machine has taken a hit.
    It's only since this happened that I've started launching Activity Monitor to try to determine if there is something obvious going on that is hogging resources. This for me has raised questions of what is "normal" for some common tasks. Presently kernel_task is using under 5% of my CPU with 50 threads running, which seems like a reasonable number. But it is also using almost 110 MB of RAM and 1.35 GB of Virtual Memory. Both of those numbers seem quite high to me, but I admit that I never looked at them before, so I have no gauge as to whether they can be considered normal or not.
    By all measures the machine still performs pretty good, but as I use this machine every day I know that it's not the same as it was prior to working on that large file. Is there something else related to Photoshop, temporary files, application caches or anything like that, that I should take a look at?
    Any input would be appreciated.
    Thanks!
    Dual 2.0Ghz G5   Mac OS X (10.4.4)  

    Hi Troy,
    First I would try using a separate HD for the scratch disk. As PS tries to read and write, it has to compete with the OS as it tries to hit its swap files. If Version Cue is on, turn it off. Get rid of any virus checking apps. Check your RAM. If you haven't allowed Tiger to run its cron scripts on a regular basis, and only do periodic maintenance, I would take the time to do an Archive and Install, or even wipe the HD, zero it, and install Tiger again. This I realize is a drastic measure, time consuming, and a pain. But if your whole system is slowing down?? Or if you get a new drive, set it up to be your primary drive first and test your system on it before moving any of your PS files over from the old HD. Use the old HD as the scratch disk.
    Have you included the use of other utilities to check your HD? Drive Genius, DiskWarrior, TechTool Pro. No Norton utilities allowed. Hope this helps, as there are a lot of places to look for the problem.

  • "File size is too large"

    When I imported the only 4 files in a particular folder (scans of photos) into the Organiser, the message "File size is too large" came up for two of them instead of preview images, although they do appear to have been added to the Catalog, as their MB size, pixel dimensions and creation date show in the Properties General tab.
    I take it this must relate to pixel count and not the file size per se as the two tifs (13.2 and 13.3 MB but 1792 x 2508 and 2518 x 1798 pixels) are fine, while the two JPGs (12.9 and 30.8 MB and 9616 x 6480 and 7048 x 10376) are the ones without preview images?
    I wasn't aware of this as a problem, so can anybody tell me please, have I just set too low a limit somewhere or is there a real restriction?
    And is this new to PSE6 or something that's been around for a while?
    Thanks, Peter

    I have a 212mb psd file (a panorama).   I have Elements 6.0 and 4 meg RAM, Vista.
    I went through the regedit process as outlined in Adobe Technote kb402760 and upped it to 1,000,000,000.  I would think that would have given enough headroom for my file size.  But it didn't.  I still have the generic "file size too large" on my thumbnails.
    I did note that the tech note had a different location for the Software/Adobe/Elements 6.0/Organizer string (HKEY_LOCAL_MACHINE\ ) from the location in my file structure. In fact, this string was listed in at least three places OTHER THAN the one indicated in the Technote. How likely is it that adding "MaxImageSize" to Elements 6.0 registry locations other than the one specifically indicated is the reason why the file size capacity was not increased? Why doesn't Elements show up in the location indicated by the Tech Note?
    NOTE FOR ADOBE:  Why isn't this a more easily accessible option in the user interface?  If I have a large file that slows things down a bit on my thumbnails, that could be my choice.  This artificial constraint built into the regedits that no one goes to is too much like parental controls for 5 year olds.
