How to manage huge (3 GB+) files in Photoshop

I have started creating 3 GB+ files in Photoshop CS2, and my computer is taking 3 minutes to open them and 10 minutes to save them, etc. - driving me mad with the delays. My system (3.16 GHz Core 2 Duo, ASUS P5K SE/EPU motherboard, 4 GB Kingston DDR2-800 RAM, Quadro FX 540 video card) copes well with 300 MB files but not with these.
Recently I moved my OS to Windows 7 Professional 64-bit in the hope that things would improve, but any change was marginal.
The files are multi-layered designs at 150 dpi, about 16 feet by 10 feet, and roughly 1.8 GB when flattened for printing.
Whilst I know that these designs are pushing the boundaries of Photoshop, I would appreciate the views of any members who have figured out how to manage huge files. Any suggestions are welcome - hardware/software upgrades, Photoshop hints (but the dpi and size cannot change), etc.
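
For scale, the flattened size is easy to sanity-check from those dimensions; a quick sketch in Python (the 8-bit RGB assumption is mine):

# Sanity-check the flattened size from the dimensions given above.
width_px = 16 * 12 * 150   # 16 ft at 150 dpi = 28,800 px
height_px = 10 * 12 * 150  # 10 ft at 150 dpi = 18,000 px
pixels = width_px * height_px          # ~518 megapixels
flat_bytes = pixels * 3                # 3 bytes per pixel for 8-bit RGB
print(f"{pixels / 1e6:.0f} MP, ~{flat_bytes / 2**30:.2f} GiB flattened")
# -> 518 MP, ~1.45 GiB; close to the reported 1.8 GB once alpha channels,
#    paths and metadata are added. Layers multiply this figure further.

Worth noting: the standard PSD format is capped at 2 GB, which is one reason saving documents this size as TIFF (4 GB cap) or PSB (Large Document Format) behaves better.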

Thanks,
You've all been helpful. My files were being saved in Photoshop (PSD) format rather than as TIFFs, but I found a 30-day free trial of Photoshop CS4 and that seems to be making a difference. It looks like I'll have to purchase it along with some more RAM. My computer is a money pit!
Thanks again

Similar Messages

  • How do I open an Illustrator file in Photoshop with layers

    No, I don't. Thanks for your reply. It came from someone as an AI file and I need to access the layers.

  • How do I open my RAW files in the Photoshop Elements 12 Editor? My files convert to JPEGs

    How do I open my RAW files in the Photoshop Elements 12 Editor? I have a MacBook. I use iPhoto and move the files I want to work on into Elements, but it converts the RAW files into JPEG files.
    Thanks for your help!!!

    Are you opening the image directly from iPhoto? Right-click & choose Edit in >> Photoshop Elements Editor.
    This link is old but gives information on set-up. It relates to PSE 11 but the principle is the same. In iPhoto preferences you navigate to the editor application in the support files folder. Start from the Finder by holding down the Option (Alt) key and choosing Library.
    http://helpx.adobe.com/photoshop-elements/kb/photoshop-elements-iphoto-mac-os.html

  • How do I convert a PDF file to Photoshop?

    Unless you understand what you're doing and why, opening a PDF in Photoshop is a very risky move,
    because any vector objects, including fonts, will be rasterized at the chosen document ppi.
    This really hurts if the designer delivered a professionally packaged PDF with vector elements, because the vectors will no longer print vector-sharp.
    For example:
    a moron at a print shop drags PDF business cards or posters into Photoshop and prints rasterized vector objects -- and when you ask why the vectors didn't print sharp, they are clueless.
    You may also hit profile issues that change the designer's colors.
    There must be 50 ways to ruin a project (and 25 of them are downstream)...
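
    To see the same rasterization happen outside Photoshop, here is a minimal sketch using the pdf2image library (my choice for illustration, not anything Adobe ships; it requires a poppler install, and 'cards.pdf' is a hypothetical file):

    # Rasterize a PDF at a fixed ppi, which is what Photoshop does on open.
    # Assumes: pip install pdf2image, plus poppler available on the system.
    from pdf2image import convert_from_path

    pages = convert_from_path("cards.pdf", dpi=300)  # vectors become pixels here
    for i, page in enumerate(pages):
        page.save(f"cards_p{i + 1}.png")  # sharp only up to 300 ppi; scaling up blurs

    Once saved this way, no tool can recover the original vectors, which is exactly why round-tripping a packaged PDF through Photoshop ruins print sharpness.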

  • How do I save an install file for Photoshop CS6

    I need to get an install file for Photoshop CS6 (not CC) that I can use on a Windows PC that, due to security reasons, does
    not have access to the internet. In the past I have been able to save the install file for Photoshop and/or Lightroom on my internet-connected PC and then move it to the target PC on a thumb drive or CD. But now with CC and the new "Adobe Installer" I can't seem to see how to capture the install file.
    Please let me know how to do this.
    dan

    Refer to this:
    Direct Download Links for Adobe Software
    Mylenium

  • How do I get a text file from Photoshop to work in the main sequence in Premiere Pro?

    How do I get a text file to work properly in the master sequence? I moved it from Photoshop, which I learned to do from a tutorial, but when I move the animated text sequence to the master, it either isn't running or it is scaled way too big. How do I get it to run in the main sequence?

    "Wont Work Here" !  Does not mean much.
    Are you having an audio or a video issue? 
    Looks like no video clip on the video layer above that section of audio.
    I am teaching myself this stuff completely on the fly
    I suggest you do the Basic Tutorials ( Adobe TV for example) in both Premiere Pro and PhotoShop.
    You need to be competent in the basics and fundamentals of these apps and that will also help you describe and discuss the issues.   Check the 'Products on this site....
    Adobe TV

  • How to create a keyboard shortcut for Load Files into Photoshop

    Does anyone know if there is a way to create a keyboard shortcut (or whether an existing one exists) to Load Files into Photoshop from Bridge?

    I should have been more specific.
    Much like Bridge > Tools > Photoshop > Load Files into Photoshop Layers, I want to load multiple files into a layered Photoshop document.
    I can't seem to find a keyboard shortcut for this action. I don't know how to write scripts very well, and despite searching through far too many keyboard-shortcut posts I can't seem to find a way.
    thanks

  • I have a Fuji X-Pro1 camera. How do I open its RAW files using Photoshop CS5? I use Mac OS X Lion 10.7.5

    I have a Fuji X-Pro1 camera. I use Mac OS X Lion 10.7.5 and I have Photoshop CS5. How do I open its RAW files?

    Since Photoshop CS5 can't use anything newer than the Camera Raw 6.7 plug-in, and that camera requires at least Camera Raw 7.1, you could use the 8.7.1 DNG Converter to convert the files to DNG copies, which Photoshop CS5 should then be able to open.
    Adobe - Adobe Camera Raw and DNG Converter : For Macintosh : Adobe DNG Converter 8.7.1
    How to use the DNG Converter:
    Camera Raw: How to use Adobe DNG Converter - YouTube
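
    For a whole card of files, the converter can also be scripted; a hedged sketch (the app path and the -c/-d flags follow Adobe's documented DNG Converter command-line options, but verify them against your version; the folders are hypothetical):

    # Batch-convert Fuji X-Pro1 .RAF files to DNG via Adobe DNG Converter's CLI.
    import subprocess
    from pathlib import Path

    CONVERTER = "/Applications/Adobe DNG Converter.app/Contents/MacOS/Adobe DNG Converter"
    out_dir = Path("~/Pictures/dng").expanduser()          # hypothetical output dir
    out_dir.mkdir(parents=True, exist_ok=True)

    for raf in Path("~/Pictures/xpro1").expanduser().glob("*.RAF"):
        # -c = write lossless-compressed DNG, -d = output directory
        subprocess.run([CONVERTER, "-c", "-d", str(out_dir), str(raf)], check=True)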

  • How to manage documents with huge file sizes in SharePoint 2013?

    We have different kinds of files (.avi, .wmv, .mpeg, .mpg, .m1v, .mp2, .wav, .snd, .au, .mdb, .db, .bat, .exe, .dll, and .jar), huge in file size.
    Solutions for a DMS would be appreciated - SharePoint, a file server, or any other solution.

    You have a hard limit of 2 GB per file which you cannot get around with Remote BLOB Storage or anything else. Beyond that size, you're going to have to store content outside SharePoint, with links to the content for your readers, or use search indexing on the file location.
    If the files are large but not greater than 2 GB they can be stored in SharePoint, but you should be aware of the supported limits for content database size, which I recall as 100 GB for a single content DB, or up to 200 GB for a rarely updated content DB containing a single site collection.
    w: http://www.the-north.com/sharepoint | t: @JMcAllisterCH | YouTube: http://www.youtube.com/user/JamieMcAllisterMVP

  • How to move huge HD video files between external hard drives and defrag ext drive?

    I have huge high definition video files on a 2TB external hard drive (and its clone).  The external hard drive is maxed out.  I would like to move many of the video files to a new 3TB external hard drive (G-drive, and a clone) and leave a sub-group of video files (1+ TB) on the original external hard drive (and its clone).  
    I am copying files from the original external drive ("ext drive A") to the new external drive ("ext drive B") via Carbon Copy Cloner, selecting the iMovie events I want to transfer one by one. Just a note: I do not know how to partition or make bootable drives, and I see suggestions with these steps in them.
    My questions:
    1.) I assume this transfer of files will create extreme fragmentation on drive A. Should I reformat/re-initialize ext drive A after moving the files I want? If so, how best to do this? Do I use "Erase" within Disk Utility? Do I need to do anything else before transferring files back onto ext drive A from its clone?
    2.) Do I also need to defrag if I reformat ext drive A? Do I defrag instead of, or in addition to, reformatting? If so, how do I do this? I've read on these forums so many warnings and heard too many stories of this going awry. Which 3rd-party software should I use?
    Thank you in advance for any suggestions, tips, advice.  This whole process makes me SO nervous.

    Here is a very good write-up on defragging in the OS X environment that I borrowed from Klaus1:
    Defragmentation in OS X:
    http://support.apple.com/kb/HT1375 which states:
    You probably won't need to optimize at all if you use Mac OS X. Here's why:
    Hard disk capacity is generally much greater now than a few years ago. With more free space available, the file system doesn't need to fill up every "nook and cranny." Mac OS Extended formatting (HFS Plus) avoids reusing space from deleted files as much as possible, to avoid prematurely filling small areas of recently-freed space.
    Mac OS X 10.2 and later includes delayed allocation for Mac OS X Extended-formatted volumes. This allows a number of small allocations to be combined into a single large allocation in one area of the disk.
    Fragmentation was often caused by continually appending data to existing files, especially with resource forks. With faster hard drives and better caching, as well as the new application packaging format, many applications simply rewrite the entire file each time. Mac OS X 10.3 onwards can also automatically defragment such slow-growing files. This process is sometimes known as "Hot-File-Adaptive-Clustering."
    Aggressive read-ahead and write-behind caching means that minor fragmentation has less effect on perceived system performance.
    Whilst 'defragging' OS X is rarely necessary, Rod Hagen has produced this excellent analysis of the situation which is worth reading:
    Most users, as long as they leave plenty of free space available and don't work regularly in situations where very large files are written and rewritten, are unlikely to notice the effects of fragmentation on either their files or on the drive's free space much.
    As the drive fills, the situation becomes progressively more significant, however.
    Some people will tell you that "OS X defrags your files anyway". This is only partly true. It defrags files that are less than 20 MB in size. It doesn't defrag larger files and it doesn't defrag the free space on the drive. In fact, the method it uses to defrag the smaller files actually increases the extent of free-space fragmentation. Eventually, once the largest free-space fragments are down to less than 20 MB (not uncommon on a drive that has, say, only 10% free space left), it gives up trying to defrag altogether. Despite this, the system copes very well without defragging as long as you have plenty of room.
    Again, this doesn't matter much when the drive is half empty or better, but it does when it gets fullish, and especially so if you are regularly dealing with large files, like video or serious audio work.
    If you look through this discussion board you will see quite a few complaints from people who find that their drive gets "slow". Often you will see them say that they "still have 10 or 20 gigs free" or the like. On modern large drives, by this stage they are usually in fact down to the point where the internal defragmentation routines can no longer operate, where their drives are working like navvies to find space for any larger files, together with room for scratch files, virtual memory, directories, etc. Such users put a lot more stress on their drives as a result, often start complaining of increased heat, and, most obviously, the computer slows down to a speed not much better than that of molasses. Eventually the directories and other related files may collapse altogether, and they find themselves with next-to-unrecoverable disk problems.
    By this time, of course, defragging itself has already become just about impossible. The amount of work required to shift the data into contiguous blocks is immense, puts additional stress on the drive, and takes forever. The extent of free-space fragmentation at this stage can be simply staggering, and any large files you subsequently write are likely to be divided into many tens of thousands of fragments scattered across the drive. Not only this, but things like the "extents files", which record where all the bits are located, will begin to grow astronomically as a result, putting even more pressure on your already stressed drive and increasing the risk of major failures.
    Ultimately this adds up to a situation where you can identify maybe three "phases" of mac life when it comes to the need for defragmentation.
    In the "first phase" (with your drive less than half full), it doesn't matter much at all - probably not enough to even make it worth doing.
    In the "second phase" (between , say 50% free space and 20% free space remaining) it becomes progressively more useful, but , depending on the use you put your computer to you won't see much difference at the higher levels of free space unless you are serious video buff who needs to keep their drives operating as efficiently and fast as possible - chances are they will be using fast external drives over FW800 or eSata to compliment their internal HD anyway.
    At the lower end though (when boot drives get down around the 20% mark on , say, a 250 or 500 Gig drive) I certainly begin to see an impact on performance and stability when working with large image files, mapping software, and the like, especially those which rely on the use of their own "scratch" files, and especially in situations where I am using multiple applications simultaneously, if I haven't defragmented the drive for a while. For me, defragmenting (I use iDefrag too - it is the only third party app I trust for this after seeing people with problems using TechToolPro and Drive Genius for such things) gives a substantial performance boost in this sort of situation and improves operational stability. I usually try to get in first these days and defrag more regularly (about once a month) when the drive is down to 30% free space or lower.
    Between 20% and 10% free space is a bit of a "doubtful region". Most people will still be able to defrag successfully in this sort of area, though the time taken and the risks associated increase as the free space declines. My own advice to people in this sort of area is that they start choosing their new , bigger HD, because they obviously are going to need one very soon, and try to "clear the decks" so that they maintain that 20% free buffer until they do. Defragging regularly (perhaps even once a fortnight) will actually benefit them substantially during this "phase", but maybe doing so will lull them into a false sense of security and keep them from seriously recognising that they need to be moving to a bigger HD!
    Once they are down to that last ten per cent of free space, though, they are treading on glass. Free space fragmentation at least will already be a serious issue on their computers but if they try to defrag with a utility without first making substantially more space available then they may find it runs into problems or is so slow that they give up half way through and do the damage themselves, especially if they are using one of the less "forgiving" utilities!
    In this case I think the best way to proceed is to clone the internal drive to a larger external with SuperDuper, replace the internal drive with a larger one and then clone back to it. No-one down to the last ten percent of their drive really has enough room to move. Defragging it will certainly speed it up, and may even save them from major problems briefly, but we all know that before too long they are going to be in the same situation again. Better to deal with the matter properly and replace the drive with something more akin to their real needs once this point is reached. Heck, big HDs are as cheap as chips these days! It is mad to struggle on with sluggish performance, instability, and the possible risk of losing the lot, in such a situation.
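
    The free-space thresholds in those "phases" are easy to keep an eye on; a small Python sketch (the phase labels simply restate the guidance above):

    # Report a volume's free-space percentage against the "phases" described above.
    import shutil

    def free_space_phase(mount_point="/"):
        usage = shutil.disk_usage(mount_point)
        free_pct = 100 * usage.free / usage.total
        if free_pct > 50:
            return free_pct, "first phase: defragging barely worth it"
        if free_pct > 20:
            return free_pct, "second phase: occasional defrag may help"
        if free_pct > 10:
            return free_pct, "doubtful region: start planning a bigger drive"
        return free_pct, "treading on glass: free up space before anything else"

    pct, phase = free_space_phase()
    print(f"{pct:.1f}% free -- {phase}")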

  • How to manage huge amounts of data in OBIEE 11g?

    Hi all Experts,
    I have a business requirement for a BANK where I need to get data for around 2 crore (20 million) accounts across different product lines, with 50 columns,
    from a staging table generated from the data warehouse.
    ** I don't need any modeling or business-model-based criteria (dimensions and facts); it goes through a direct database request.
    How do I handle this and make the report output faster?
    *** If I create the same report from an OBIEE RPD-based subject area with presentation tables (with filters to get fewer rows), it never comes up with any result - it fails and returns errors.
    any suggestion will help a lot.
    Thanks in advance
    Raj M

    "if the product does not peform"...
    Let's put the problem into a perspective that guys (which I assume we all are for the sake of the argument): cars.
    Using your direct database request as a starting point it's a bit like trying to use a Ferrari to pull a camper. Yes, it will work, but it's the wrong approach. Likewise (or conversely) it's a pretty bad idea to take a Land Rover Defender around the Nurburg Ring.
    In both cases "the product" (i.e. the respective cars) "will not perform" (i.e. not fulfill the intended duties the way you may want them to).
    I never get why everyone always bows to the most bizarre requests like "I MUST be able to export 2 million rows through an analysis exposed on a dashboard" or "This list report must allow scrolling through 500k records per day across 300 columns".

  • How to manage an MS Office file

    I'd like to choose the best apps to manage my work docs; we work on a LAN with Windows 7 Pro as the OS and MS Office 2010 Pro as the suite.
    Have you ever tried iWork with a complex Excel spreadsheet?
    Is there another app better than iWork?
    Unfortunately, OpenOffice (FSF) doesn't work with the iPad 2.

    Hello there,
    I have the following task: I need to lock certain MS Office files (Word, Excel...) so they can be viewed only by certain people on my domain, who already have digital certificates issued by the AD Certification Authority running on Windows Server 2012 R2.
    The requirement is that these people can view the files, but if the files are copied to another location or sent by e-mail or any other way, they should be locked even against opening.
    I'm not sure this can be done with the mentioned certificates alone, so I'm open to any suggestions on the topic.
    Thanks in advance for the info and the help.

  • How to manage Podcasts as normal files?

    I have podcasts that are updated and auto-deleted regularly. Sometimes I want to treat a few files differently and save them, or put them in smart playlists, while not having them show up on my main Podcasts menu. However, as soon as I delete them from my Podcasts folder they disappear from everywhere else. How can I stop that?
    Alternatively, if I right-click an individual episode and mark it to not be auto-deleted, will it apply to just that episode, or the entire Podcast?

    I'm marking this answered to get rid of it and I made another thread on the subject anyway. I'm still really annoyed that this can't be done though.

  • Newbie: How to manage the number of files in a folder

    I have a backup application that runs each day and creates a file in a folder.
    I have an iCal script which automates this backup to run at a specific time, but I would like to modify it so that it also deletes the oldest file in the folder. I cannot figure out the best way to do this. Does anyone have any suggestions?

    This rather long script finds the oldest file in the complete nested range of a chosen outer folder and deletes it.
    property filePathList : {}
    property fileDateList : {}

    -- properties persist between runs, so clear both lists first
    set filePathList to {}
    set fileDateList to {}

    getEveryFileName((choose folder) as alias)
    Sort_items(fileDateList, filePathList)
    set oldest to item 1 of filePathList
    tell application "Finder" to delete oldest

    -- collect every file (recursively) under theFolder
    to getEveryFileName(theFolder)
        tell application "Finder"
            my getNamesDates(get files of theFolder)
            set FolderList to folders of theFolder
            repeat with everyFolder in FolderList
                my getEveryFileName(everyFolder)
            end repeat
        end tell
    end getEveryFileName

    -- record each file's modification date and alias in the two lists
    to getNamesDates(FileList)
        repeat with everyFile in FileList
            try
                tell application "Finder"
                    set end of fileDateList to modification date of everyFile
                    set end of filePathList to everyFile as alias
                end tell
            on error anError
                display dialog anError
            end try
        end repeat
    end getNamesDates

    to Sort_items(sortList, SecondList)
        -- sorts by the first list (dates), while keeping the second list in the same order
        tell (count sortList) to repeat with i from (it - 1) to 1 by -1
            set s to sortList's item i
            set r to SecondList's item i
            repeat with i from (i + 1) to it
                tell sortList's item i to if s > it then
                    set sortList's item (i - 1) to it
                    set SecondList's item (i - 1) to SecondList's item i
                else
                    set sortList's item (i - 1) to s
                    set SecondList's item (i - 1) to r
                    exit repeat
                end if
            end repeat
            if it is i and s > sortList's end then
                set sortList's item it to s
                set SecondList's item it to r
            end if
        end repeat
    end Sort_items
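    If AppleScript feels heavy, the same "delete the oldest file" idea is a few lines of Python (a sketch; the backup folder path is hypothetical):

    # Delete the oldest file (by modification date) anywhere under a folder.
    from pathlib import Path

    backup_dir = Path("~/Backups").expanduser()  # hypothetical folder
    files = [p for p in backup_dir.rglob("*") if p.is_file()]
    if files:
        oldest = min(files, key=lambda p: p.stat().st_mtime)
        oldest.unlink()  # removes permanently (no Trash, unlike Finder's delete)
        print(f"Deleted {oldest}")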

  • How to manage two OS text files with the UTL_FILE package simultaneously?

    Hi
    I am using the UTL_FILE package to insert rows into my table from OS text files.
    It inserts rows into one of the columns of the table successfully.
    But when I change the code in the following routine to insert two values per row from two OS text files, it fails and no row is inserted into the table.
    Could someone suggest what changes I should make to the following script so that it inserts rows into both columns of the table from the two different OS text files simultaneously?
    The UTL_FILE_DIR parameter is set to * .
    I am using Oracle 8.1.7.
    Following is the procedure with which I am trying to insert the rows into two columns at a time:
    Regards
    Sharbat
    DECLARE
        l_file_handle1 UTL_FILE.FILE_TYPE;
        l_file_handle2 UTL_FILE.FILE_TYPE;
        l_buffer1      VARCHAR2(4000);
        l_buffer2      VARCHAR2(4000);
    BEGIN
        l_file_handle1 := UTL_FILE.FOPEN('c:\Test\Result', 'System_Name.txt', 'r', 4000);
        l_file_handle2 := UTL_FILE.FOPEN('c:\Test\Result', 'Machine.txt', 'r', 4000);
        LOOP
            -- GET_LINE raises NO_DATA_FOUND at end-of-file, which ends this loop;
            -- note it stops as soon as the shorter of the two files runs out.
            UTL_FILE.GET_LINE(l_file_handle1, l_buffer1);
            UTL_FILE.GET_LINE(l_file_handle2, l_buffer2);
            INSERT INTO test (hostname, machine) VALUES (l_buffer1, l_buffer2);
        END LOOP;
    EXCEPTION
        WHEN NO_DATA_FOUND THEN
            UTL_FILE.FCLOSE(l_file_handle1);
            UTL_FILE.FCLOSE(l_file_handle2);
            COMMIT;  -- commit once, after the last line has been read
        WHEN OTHERS THEN
            IF UTL_FILE.IS_OPEN(l_file_handle1) THEN
                UTL_FILE.FCLOSE(l_file_handle1);
            END IF;
            IF UTL_FILE.IS_OPEN(l_file_handle2) THEN
                UTL_FILE.FCLOSE(l_file_handle2);
            END IF;
            RAISE;  -- re-raise so the real error is not silently swallowed
    END;

    I recommend you post this in the PL/SQL forum to get a fast answer:
    PL/SQL
    Joel Pérez
