Large TIFF files in CS2

I have CS2, old I know, and find I can't open large TIFF or raw files, only JPEG. I want to use the uncompressed files from my camera. Any suggestions? I heard there was an update a few years ago that allowed this. Any help will be appreciated. I don't use the program enough to justify buying the new version.
Thanks, Dale

Which operating system are you using?
Is Photoshop CS2 updated to version 9.0.2 and Camera Raw 3.7?
(you can get the photoshop updates from here: Adobe - Photoshop : For Windows)
What happens when you try to open the files?
Have you tried resetting the Photoshop CS2 preferences?
By raw files, do you mean camera raw files and, if so, from which camera?
How big, in terms of pixel dimensions, are the TIFF files you're trying to open, and are they from your camera or some other source?

Similar Messages

  • Speeding up a dual-core G5 to work with large Photoshop files

    Hi
    I have a 2.3GHz dual-core G5 (the one that came out late 2005).
    I'm working on a bunch of large-format film scans (500MB+) in Photoshop CS2, and I'm trying to speed things up.
    This last week I've installed two upgrades that have helped get things moving - first I upgraded the RAM from 4.5GB to 12GB, then I installed a second hard drive (Seagate 7200.11 500GB, with jumper in place) to use as a dedicated scratch disk.
    Both upgrades have given a significant speed boost, but I'm now wondering what else I can do.
    I want to reduce the time it takes to open and save these large scans.
    My first thought was to buy a second Seagate 500GB drive as a replacement for the original 250GB WD drive. I would then have two 500GB internal drives that I could configure as a RAID 0 array with Disk Utility, and I would clone my original 250GB drive onto the RAID 0 array with SuperDuper.
    Wouldn't such a setup double the speed of opening and saving large Photoshop files?
    I realise that with RAID 0 there is an increased chance of data loss from disk failure (double the chance? see the note after this post), but if I back up daily to a 1TB external, that should be OK, no?
    Or should my next move be to utilise the PCI-E slots (which I don't really understand)?
    Thanks for any advice you can offer.
    Richard
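    On the "double the chance" aside: a quick sanity check, assuming the two disks fail independently with probability p over a given period. A RAID 0 array is lost if either disk fails, so P(array lost) = 1 - (1 - p)^2 = 2p - p^2, which is approximately 2p for small p. So "double the chance" is essentially right, and the daily backup is the right mitigation.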

    In my G5 Quad, I find the fastest Photoshop performance overall, especially with large-file opens and saves, occurs when the setup is as follows:
    Startup disk is a 2 x 150GB Raptor RAID 0 which contains the system, apps, and all user files, including the image file being worked on.
    PS scratch is then pointed either to a third fast drive or to an external RAID 0 array. This setup is substantially faster than the safer method of using one Raptor for system and apps and the other for user and image files, with a third drive for PS scratch.
    With a really fast, large scratch disk, you can sometimes put the image file on the same volume as the scratch file, open and save it from there, and you'll get a bit of an additional boost, so long as the scratch disk is faster than the startup disk.
    For CS3 (I believe it works for CS2 as well), a performance plugin called DisableScratchCompress.plugin is available which can also speed things up with a fast scratch disk. I found that using this plugin sped things up, but only if the image file was opened/saved from the first config I mentioned; it was slower if the file was placed on the scratch disk while using the DSC plugin.
    More here: Photoshop Acceleration Basics
    Of course, if you stripe a disk with data, be sure to back it up frequently. :)

  • Arbitrary waveform generation from a large text file

    Hello,
    I'm trying to use a PXI 6733 card hooked up to a BNC 2110 in a PXI 1031-DC chassis to output arbitrary waveforms at a sample rate of 100kS/s.  The types of waveforms I want to generate are generally going to be sine waves of frequencies less than 10 kHz, but they need to be very high quality signals, hence the high sample rate.  Eventually, we would like to go up to as high as 200 kS/s, but for right now we just want to get it to work at the lower rate. 
    Someone in the department has already created large text files (> 1GB) for me, with nine columns of numbers representing the output voltages for the channels (there will be six channels outputting sine waves and three other channels with a periodic DC voltage). The reason for the large file is that we want a continuous signal for around 30 minutes, to allow for equipment testing and configuration while the signals are being generated.
    I'm supposed to use this file to generate the output voltages on the 6733 card, but I keep getting numerous errors and I've been unable to get something that works. The code, as written, currently generates error -200290 immediately after the buffered data is output from the card. Nothing ever seems to get enqueued or dequeued, and although I've read the LabVIEW help on buffers, I'm still very confused about their operation, so I'm not even sure the buffer is working properly. I was hoping some of you could look at my code and give me some suggestions (or sample code, too!) for the best way to achieve this goal.
    Thanks a lot,
    Chris (new LabVIEW user)
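    For scale: at 100 kS/s, 30 minutes is 100,000 S/s x 1,800 s = 1.8 x 10^8 samples per channel; across nine columns that is about 1.6 x 10^9 values, which as ASCII text easily exceeds 1 GB and, parsed into 8-byte doubles, would need roughly 13 GB of memory. The file therefore has to be read and queued in chunks rather than loaded whole, which is what the buffering discussion below is about.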

    Chris:
    For context, I've pasted in the "explain error" output from LabVIEW to refer to while we work on this. More after the code...
    Error -200290 occurred at an unidentified location
    Possible reason(s):
    The generation has stopped to prevent the regeneration of old samples. Your application was unable to write samples to the background buffer fast enough to prevent old samples from being regenerated.
    To avoid this error, you can do any of the following:
    1. Increase the size of the background buffer by configuring the buffer.
    2. Increase the number of samples you write each time you invoke a write operation.
    3. Write samples more often.
    4. Reduce the sample rate.
    5. Change the data transfer mechanism from interrupts to DMA if your device supports DMA.
    6. Reduce the number of applications your computer is executing concurrently.
    In addition, if you do not need to write every sample that is generated, you can configure the regeneration mode to allow regeneration, and then use the Position and Offset attributes to write the desired samples.
    By default, the analog output on the device does what is called regeneration. Basically, if we're outputting a repeating waveform, we can simply fill the buffer once and the DAQ device will reuse the samples, reducing load on the system. What appears to be happening is that the VI can't read samples out from the file fast enough to keep up with the DAQ card. The DAQ card is set to NOT allow regeneration, so once it empties the buffer, it stops the task since there aren't any new samples available yet.
    If we go through the options, we have a few things we can try:
    1. Increase background buffer size.
    I don't think this is the best option. Our issue is with filling the buffer fast enough, and resizing the buffer requires more advanced configuration anyway.
    2. Increase the number of samples written.
    This may be a better option. If we increase how many samples we commit to the buffer, we can increase the minimum time between writes in the consumer loop.
    3. Write samples more often.
    This probably isn't as feasible. If anything, you should probably have a short "Wait" function in the consumer loop where the DAQmx write is occurring, just to regulate loop timing and give the CPU some breathing space.
    4. Reduce the sample rate.
    Definitely not a feasible option for your application, so we'll just skip that one.
    5. Use DMA instead of interrupts.
    I'm 99.99999999% sure you're already using DMA, so we'll skip this one also.
    6. Reduce the number of concurrent apps on the PC.
    This is to make sure that the CPU time required to maintain good loop rates isn't being taken by, say, an antivirus scanner or something. Generally, if you don't have anything major running other than LabVIEW, you should be fine.
    I think our best bet is to increase the "Samples to Write" quantity (to increase the minimum loop period), and possibly to delay the DAQmx Start Task and consumer loop until the producer loop has had a chance to build the queue up a little (see the sketch after this reply). That should reduce the chance that the DAQmx task will empty the system buffer and ensure that we can prime the queue with a large quantity of samples. The consumer loop will wait for elements to become available in the queue, so I have a feeling that the file read may be what is slowing the program down. Once the queue empties, we'll see the DAQmx error surface again. The only real solution is to load the file into memory farther ahead of time.
    Hope that helps!
    Caleb Harris
    National Instruments | Mechanical Engineer | http://www.ni.com/support
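    A minimal sketch of the prime-then-consume idea from option 2 and the closing paragraph above, written in plain Java since LabVIEW diagrams can't be pasted here. The file name, chunk size, priming depth, and writeToDevice are stand-ins for the real DAQmx pieces, not the poster's code:

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;

    public class PrimedProducerConsumer {
        static final int CHUNK = 10_000; // samples per write: bigger chunks mean fewer, longer writes

        public static void main(String[] args) throws Exception {
            // A bounded queue plays the role of the LabVIEW queue between the two loops.
            BlockingQueue<double[]> queue = new ArrayBlockingQueue<>(64);

            Thread producer = new Thread(() -> {
                try (BufferedReader in = new BufferedReader(new FileReader("waveform.txt"))) {
                    double[] chunk = new double[CHUNK];
                    int n = 0;
                    String line;
                    while ((line = in.readLine()) != null) {
                        // Parse only the first column for brevity; the real file has nine.
                        chunk[n++] = Double.parseDouble(line.split("\\s+")[0]);
                        if (n == CHUNK) {
                            queue.put(chunk); // blocks when the queue is full, throttling the reader
                            chunk = new double[CHUNK];
                            n = 0;
                        }
                    }
                } catch (Exception e) {
                    e.printStackTrace();
                }
            });
            producer.start();

            // Prime: wait until the producer has banked a backlog before starting
            // the output, so a momentarily slow disk read can't starve the device.
            // (Assumes the file holds well over 32 chunks.)
            while (queue.size() < 32) {
                Thread.sleep(10);
            }

            while (true) {
                double[] chunk = queue.take(); // blocks until the producer delivers more data
                writeToDevice(chunk);          // stands in for the DAQmx Write call; a real
                                               // program would also signal end-of-file,
                                               // e.g. with a sentinel chunk
            }
        }

        static void writeToDevice(double[] chunk) {
            // Hand the chunk to the output hardware here.
        }
    }

    The shape is the same in LabVIEW: the producer loop reads and enqueues file chunks, the consumer loop dequeues and calls DAQmx Write, and the priming step is just a wait before starting the task.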

  • Problems viewing large TIFF files in Preview

    I'm having trouble viewing large TIFF files in Preview. The TIFF files are on a CD-ROM - when I click on a file to open it, Preview opens only the first 250 pages. The file has approximately 3,000 pages. How can I get Preview to open the rest of the pages?
    thanks for any suggestions.
    mac mini   Mac OS X (10.4.6)  

    No trick - I didn't create the CD-ROM, but it only has 3 TIFF files with approximately 3,000 pages each, not 3,000 large TIFF files, plus several smaller PDF files, and those aren't giving me any problems.
    I don't know whether they're compressed, but I still can't get more than the first 250 pages to open, even after copying the file to my desktop. If anyone has any other ideas, I'd much appreciate it.
    mac mini   Mac OS X (10.4.6)  

  • How can I send a large video file from my iPhone 4?

    How can I send (by text) a fairly large video file from my iPhone 4 without it getting compressed first?
    If that's not possible, is there a way to uncompress the video file once it is received on the receiving iPhone?
    Or is there a way to sync or transfer a video file from iPhone to iPhone?

    ExFAT would be the best file system format, as it will handle files greater than 4GB.
    If exFAT is not available, go for FAT32.
    Plain FAT is too limiting, so avoid that.
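    For reference on that 4GB figure: FAT32 stores each file's size in a 32-bit field, so a single file tops out at 2^32 - 1 bytes, just under 4 GiB; exFAT uses a 64-bit size field, which is why it copes with large video files.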

  • Error generating a new XSL Transformer from a large XSLT file

    Good day to all,
    Currently I am facing a problem: whenever I try to generate a Transformer object from a TransformerFactory, a TransformerConfigurationException is thrown. I have done some research on the net and understand that it is due to a bug involving the JVM's 64KB bytecode-per-method limit. Is there any external package or project that has already addressed this problem? I checked Apache, and they have already patched the problem in Xalan 2.7.1, but I couldn't find any release of 2.7.1.
    Please help
    Regards
    RollinMao

    If you have the transformation rules in a separate XSLT file, then you can use the com.icl.saxon package to transform the XML files. I have used this package with large XSL files and it has worked well.
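    A minimal sketch of that suggestion, assuming Saxon 6.x is on the classpath (the file names are placeholders). It tells JAXP to hand out Saxon's transformer instead of the default one, whose compiled stylesheets can hit the JVM's 64KB-per-method limit:

    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.stream.StreamResult;
    import javax.xml.transform.stream.StreamSource;

    public class SaxonTransform {
        public static void main(String[] args) throws Exception {
            // Select Saxon 6.x's JAXP factory instead of the JDK default.
            System.setProperty("javax.xml.transform.TransformerFactory",
                    "com.icl.saxon.TransformerFactoryImpl");

            TransformerFactory factory = TransformerFactory.newInstance();
            // Compile the (large) stylesheet once, then reuse the Transformer.
            Transformer transformer =
                    factory.newTransformer(new StreamSource("rules.xslt"));
            transformer.transform(new StreamSource("input.xml"),
                    new StreamResult("output.xml"));
        }
    }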

  • Is there a way to import large XML files into HANA efficiently? Are there any data services provided to do this?

    1. Is there a way to import large XML files into HANA efficiently?
    2. Will it process it node by node or the entire file at a time?
    3. Are there any data services provided to do this?
    This is for a project use case. I also have a requirement to process bulk XML files; please suggest how I can accomplish this task.

    Hi Patrick,
    I am addressing a similar issue: getting data from huge XMLs into HANA.
    Using OData services, can we handle huge data (i.e. create the schema and load into HANA) on the fly?
    In my scenario,
    I get a folder of different complex XML files which are to be loaded into the HANA database.
    Then I have to transform and cleanse the data.
    Can I use OData services to transform and cleanse the data?
    If so, how can I create OData services dynamically?
    Any help is highly appreciated.
    Thank you.
    Regards,
    Alekhya
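    On question 2 above (node by node vs. the entire file): whatever loader is used, the key is a streaming parser, which visits one node at a time instead of building the whole document in memory. A minimal sketch with the standard Java StAX API; "huge.xml" and the "record" element are hypothetical placeholders:

    import java.io.FileInputStream;
    import javax.xml.stream.XMLInputFactory;
    import javax.xml.stream.XMLStreamConstants;
    import javax.xml.stream.XMLStreamReader;

    public class StreamHugeXml {
        public static void main(String[] args) throws Exception {
            XMLInputFactory factory = XMLInputFactory.newInstance();
            try (FileInputStream in = new FileInputStream("huge.xml")) {
                // Pull-parse the document: only the current event is held in
                // memory, so a multi-GB file never has to fit in the heap.
                XMLStreamReader reader = factory.createXMLStreamReader(in);
                while (reader.hasNext()) {
                    if (reader.next() == XMLStreamConstants.START_ELEMENT
                            && "record".equals(reader.getLocalName())) {
                        // Collect this record's fields and batch-insert them into
                        // HANA (e.g. over JDBC) instead of building a DOM tree.
                    }
                }
                reader.close();
            }
        }
    }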

  • Copying large video files to PC-external HD

    I need to deliver a large QuickTime file (5.87GB) edited in FCE to someone who will open it on his PC in Avid.
    It seems to be complicated:
    The first problem is conversion - I need to convert my QT file to a Windows Media file, right?
    This can be done with a conversion programme, e.g. Microsoft Expression Encoder, which I have installed on a PC laptop.
    But how do I move the 5.87GB file to the PC?
    I have an external HD formatted FAT32, but that only takes files up to 4GB max.
    Is there really no way to copy a large .mov file onto an external hard drive that can be read by a PC?
    To get around the problem, I have tried to make an .avi file by exporting from FCE with QuickTime conversion - but the quality is really bad.
    Also, making a smaller QT file (that does not exceed the 4GB limit on the HD) does not look good when compressed a second time in Expression to a Windows Media file.
    So, what to do?
    Can PCs read NTFS-formatted HDs? I guess not; I tried once, but maybe I didn't format it right.
    Sorry for all these questions in one go - hopefully there's a way to work around this annoying problem...

    For file transfer, here's what I suggest (sort of in order of recommendation):
    Use a network connection. Both computers (your Mac and your friend's PC) on the same network; turn on file sharing on your Mac, log the PC on to your Mac, and copy the file(s).
    Use an FTP connection via the internet. You and your friend will both need an FTP client.
    Use an NTFS-formatted external hard drive. All modern PCs use NTFS (Windows 2000/XP/Vista/7). You will need a copy of NTFS for Mac (or a similar utility) on your Mac to write to an NTFS-formatted disk.
    Use a Mac OS Extended formatted external hard drive. Your friend will need a copy of MacDrive (or a similar utility) on his PC to read files from a Mac OS Extended formatted disk.
    Burn your 5.87GB file to a dual-layer DVD. Use the Finder to copy the file to DVD media and burn the disk. Your Mac will need to be able to burn to dual-layer DVD media, and your friend's PC will need a DVD drive to read the disk.

  • Large Data file problem in Oracle 8.1.7 and RedHat 6.2EE

    I've installed RedHat 6.2EE (Enterprise Edition Optimized for Oracle8i) and Oracle EE 8.1.7. I am able to create very large files (> 2GB) using standard commands such as 'cat', 'dd', etc. However, when I create a large data file in Oracle, I get the following error messages:
    create tablespace ts datafile '/data/u1/db1/data1.dbf' size 10000M autoextend off
    extent management local autoallocate;
    create tablespace ts datafile '/data/u1/db1/data1.dbf' size 10000M autoextend off
    ERROR at line 1:
    ORA-19502: write error on file "/data/u1/db1/data1.dbf", blockno 231425
    (blocksize=8192)
    ORA-27069: skgfdisp: attempt to do I/O beyond the range of the file
    Additional information: 231425
    Additional information: 64
    Additional information: 231425
    Does anyone know what's wrong?
    Thanks
    david
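    For reference, the failing offset works out to 231425 blocks x 8192 bytes/block = 1,895,833,600 bytes, so the write failed roughly 1.9 GB into the requested 10000M file, well short of the full size. That suggests an OS or filesystem large-file limit rather than a lack of disk space.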

    I've finally solved it!
    I downloaded the following jre from blackdown:
    jre118_v3-glibc-2.1.3-DYNMOTIF.tar.bz2
    It's the only one that seems to work (and god, have I tried them all!)
    I've no idea what the DYNMOTIF means (apart from being something to do with Motif - but you don't have to be a Linux guru to work that out ;)) - but, hell, it works.
    And after sitting in front of this machine for 3 days trying to deal with Oracle's frankly PATHETIC install, which is so full of holes and bugs, that's all I care about...
    The JRE bundled with Oracle 8.1.7 doesn't work with RedHat Linux 6.2EE.
    Doesn't Oracle test their software?
    Anyway I'm happy now, and I'm leaving this in case anybody else has the same problem.
    Thanks for everyone's help.

  • Large Excel files won't combine to PDF in Acrobat X or XI

    I have large Excel files (0.3 to 1.2 MB) that I used to combine into a single PDF. It now only works on Excel files smaller than 1MB. This was working a few weeks ago. What happened? I have tried on computers that have both Acrobat X and XI (we have both scattered around our building).
    Any ideas?  Thanks.

    FileOpen is a security add-on that is needed to open some files; it is not part of the Adobe software. Maybe you need to upgrade it, or report the problem: http://www.fileopen.com/

  • How to increase the performance of Photoshop CS6 v13.0.6 with transformations in LARGE image files (25,000 x 50,000 pixels) on an iMac 3.4 GHz Intel Core i7, 16 GB memory, Mac OS 10.7.5? Should I purchase a Mac Pro?

    I have hundreds of these large image files to process. I frequently use the SKEW, WARP, and IMAGE ROTATION features in Photoshop. I process the files ONE AT A TIME and have only Photoshop running on my iMac. Most of the time I am watching SLOW progress bars while the transformations complete.
    I have allocated as much memory as I have (about 14GB) exclusively to Photoshop using the Performance preferences panel. Does anyone have a trick for speeding up the processing of these files, or is my only solution to buy a faster Mac, like the new Mac Pro?


  • Help with Canon G9 RAW file in CS2

    I have CS2, and I recently purchased a Canon G9. I noticed that there is an update for Camera Raw that lists the G9, but the update for CS2 doesn't include it. Can someone point me to how I can read G9 RAW files in CS2?
    Thanks.

    Use the most recent DNG Converter (version 4.3.1, part of the Camera Raw 4.3.1 download) and convert the G9 files to DNG. Of course, with this nice new camera it would be a real shame to be stuck on Camera Raw 3.x, because Camera Raw 4.x (and Photoshop CS3) is a powerful and worthwhile update.

  • Can't download large PDF file

    I'm trying to download a large PDF file (26.2 MB) to my iPad. It appears to download, then displays as 440 blank pages. It offers to open the file in iBooks, but that doesn't produce any results either. Is there a limit on the size of PDF file it can handle? I tried to e-mail it from my computer, but the file is too big for my e-mail program. This is a book I would really love to read on the iPad - how can I get it there? Thanks, B.

    Yes, I see that as well. Talk about a bad assumption. I thought from what I had read that the feature worked like Spotlight on the Mac. In fact I could swear I've read that on Apple's site somewhere. Not the first time I was wrong and certainly not the last. Alas I stand corrected again.
    Frankly, after investigating the search function further - IMHO - it seems pretty useless.  A number of the apps that I use have their own search functions built in. I've read that you can search for a missing app on your iPad with this feature, but other than that and finding contacts and emails it seems pretty pointless to me.

  • Is there a good way to add a large music file to iTunes Match without iTunes crashing?

    I have several large music files that are DJ sessions, anywhere from 1 to 3 hours long. I deleted them from iTunes while matching because they were causing Match to crash at step 3 every time. I would like to add these files to Match now, but every time I try, it crashes again. Does anyone know of a workaround for this?

    The iTunes Match service is designed to exclude such songs. Perhaps handling them as podcasts would help you.

  • I need to sort very large Excel files and perform other operations. How much faster would this be on a Mac Pro rather than my MacBook Pro i7, 2.6, 15R?

    I am a scientist and run my own business. Money is tight. I have some very large Excel files (~200MB) that I need to sort and perform logic operations on. I currently use a MacBook Pro (i7 core, 2.6GHz, 16GB 1600 MHz DDR3) and I am thinking about buying a multicore Mac Pro. Some of the operations take half an hour to perform. How much faster should I expect these operations to be on a new Mac Pro? Is there a significant speed advantage in the 6-core vs the 4-core? Practically speaking, what are the features I should look at, and what is the speed bump I should expect if I go to 32GB or 64GB? Related to this, I am using a 32-bit version of Excel. Is there a 64-bit spreadsheet that I can use on a Mac that has no limit on column and row size?

    Grant Bennet-Alder,
    It's funny you mentioned using Activity Monitor. I use it all the time to watch when a computation cycle is finished so I can avoid a crash. I keep it up in the corner of my screen while I respond to email or work on a grant. Typically the %CPU will hang at ~100% (sometimes even saying the application is not responding, in red) but will almost always complete the cycle if I let it go for 30 minutes or so. As long as I leave Excel alone while it is working, it will not crash. I had not thought of using Activity Monitor as you suggested. Also, I did not realize that using a 32-bit application limited me to 4GB of memory per application. That is clearly a problem for this kind of work. Is there any workaround for this? It seems like a 64-bit spreadsheet would help. I would love to use the new 64-bit Numbers, but the current version limits the number of rows and columns. I tried it out on my MacBook Pro but my files don't fit.
    The hatter,
    This may be the solution for me. I'm OK with assembling the unit you described (I've even etched my own boards) but feel very bad about needing to step away from Apple products. When I started computing, this was the sort of thing computers were designed to do. Is there any native 64-bit spreadsheet that allows unlimited rows/columns and will run on an Apple machine? Excel is only 64-bit on Windows machines.
    Many thanks to both of you for your quick and on point answers!
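    On the 4GB point above: a 32-bit process can address at most 2^32 bytes = 4,294,967,296 bytes (4 GiB), and in practice the OS reserves part of that, so no amount of installed RAM raises the per-application ceiling; only a 64-bit build of the spreadsheet can use the full 16GB or more.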

Maybe you are looking for

  • Is there a graphic equalizer in Apple TV

    I am thinking of buying an Apple TV, but I really want to know whether it has a graphic equalizer, because I really need it for better audio quality. Could you please answer me?

  • MAX behaving strangely...

    I'm trying to do what at first I thought would be a very simple query: I am extracting a list of course completions, and need to join to 'order' and 'order_items' tables (think shopping cart: one order record relates to many order_items records)

  • Update phone number

    Hi, I am not able to update "My Number" under "Phone\Settings". I am using an iPhone 4. Does anybody know what I should do?

  • DSO Request Activation

    Hi All: When trying to activate a successfully loaded request in an ODS, the job cancels. When I check the Job Logs, it mentions error "SAPSQL_EMPTY_TABNAME", Exception "CX_SY_DYNAMIC_OSQL_SYNTAX". Short Text "A dynamically specified FROM clause has

  • How to align eyes in a photo montage?

    I'm doing a "daily photo" montage of me and have more than 500 photos of my face. I just want my eyes to align perfectly in the different photos so when I play them fast, my face is constantly well positioned and blend perfectly with the other photos