File corruption with a very large .ai file

A little background first:
I am a graphic designer/cartographer with 15+ years of experience. I started making maps in Illustrator with version 6 and have upgraded to every version since.
My machines:
2x Mac Pro 8-core 3.0GHz, 16GB RAM, 10.5.7
Mac Pro quad-core 2.66GHz, 8GB RAM, 10.5.7
MacBook Pro 2.0GHz, 2GB RAM, 10.5.7
Illustrator specs:
All machines have CS4 installed as well as Illustrator 10.
The 8-core MPs have the MAPublisher Plug-ins installed
The 4-core and MacBook Pro (MBP) do not have the MAPublisher Plug-ins
The problem I am having can be replicated on each of the machines. The MBP can't handle the file due to its limited RAM. Since this occurs on machines that have MAPublisher installed and a machine that does not, I think we can rule out a plug-in issue.
File specs:
The original file: version 10, file size (uncompressed, no PDF support, and no font-embedding) is 36.4 MB. There are no raster effects or embedded/placed images. This is strictly a vector file. Artboard Dimensions: 85.288 in x 81.042 in
The original file, converted with CS4, and then saved as a CS4 file: file size (uncompressed, no PDF support, and no font-embedding) is 97.9 MB.
Brief Description of the problem:
I have tried to convert this file into every version of CS and it has failed every time. With each version, it has resulted in an unusable file for different reasons. In CS through CS3, the file was completely unusable because of the opening/saving time; it could take as long as 3 hours to save the file. With CS4, this has been rectified, so I once again tried to convert it. Upon re-opening the 'converted' CS4 native file, the file is 'corrupted'.
The file corruption is not your regular "This file can't be opened because of: X" corruption. The file opens after a save/close just fine; it is just that parts of the file get destroyed. To save space in this post, I have created a webpage that illustrates the problem that I am having:
http://newatlas.com/ai_problem/
I have tried everything possible to make the file smaller, and it is as slimmed down as I can make it (using symbols, styles, etc.). I have also tried to rule out a font problem by replacing every font with an Adobe-supplied font, clearing caches, etc. This does not work, so I think we can rule out a font issue. I have also reduced this file to contain no pattern fills and no gradients, just simple fills and strokes. All to no avail. I have also tried piecing the file back together into a new document by copying/pasting a layer at a time, saving, closing and re-opening after each paste cycle. I can get about 95% of it put back together and then it will manifest the problem. The only thing I haven't done is to convert all of the type to outlines. That would not solve my problem, since this is a map that I continually work on year after year. I also can't remove objects or cut the overall area of the map, because this file is used to produce an atlas book, a wall map and custom boundary wall maps. You can view the entire file at:
http://okc.cocpub.com
If I do not convert the legacy text, the file saves/closes/re-opens just fine; it just takes a very long time. So this leads me to think that the cause of the problem is the number of editable type objects this file has. Ever since Adobe changed the Type Engine, I haven't been able to use this file in current versions of Illustrator.
If I could get this file to open uncorrupted, I could finally get rid of Illustrator 10. Illustrator 10 does not have any problem with this file (and is still faster than CS4 at everything except selecting a lot of objects).
I am posting this on the forums as a first step, for any other opinions/ideas from the 'Illustrator Gurus'. I want to get in contact with someone at Adobe to see if we can address this problem and possibly get it fixed in CS5. I know that this is a user-to-user forum, but I'm not sure whom to contact at Adobe for this issue, or how. Maybe someone on these forums can help with that as well.
Thank you for your patience in getting this far into my long post; I would really appreciate any response.
Dave

Thanks, Wade, for responding.
Did you try trashing your Adobe Illustrator CS4 Settings folder in your User's Preferences?
Yes, I've tried deleting prefs. Basically, I've tried to rule out any problem with Illustrator as a whole. This issue has also occurred on a clean install of OS X and Illustrator, on a new user account, with opening this file being the very first task Illustrator performed. There is no problem with Illustrator per se; I think it is more a limitation in Illustrator based on the number of type objects.
You could also try saving it out of 10 as a PDF, or as PostScript and distilling it, then opening that in AI or placing it in a blank AI document.
Did you try placing it instead of opening it?
I haven't tried any of these, since the resulting file would be utterly unusable. Basically, this would create a 'flat' file with 'broken' strings of text (type on a path especially) and uneditable type. (Now that I think about it, CS4 does a much better job of opening PDFs without breaking type.) I still think this approach is not a prudent course of action since, as of now, I can continue to maintain this map in Illy 10.
In my experimentation, the results are as follows:
1. Opening the file without updating the legacy type, saving it as a new document, closing and then re-opening results in the file you would expect. Every object is where it is supposed to be. Downfall of this method: I absolutely need the type to be editable, especially the 'Street Type', since this type is actually used to create the map indexes.
2. Opening the file with updating the legacy type, saving it as a new document, closing and then re-opening results in a file that exhibits the exact behavior posted in the thread-starter. With this method, the 'Bruce Gray' type is the duplicated item.
3. Opening the file without updating the legacy type, then splitting the file into layer groups and saving them as separate files, then opening the resulting CS4 files, updating the legacy type, and copying & pasting (with layer structure) into a new document results in a usable file, up to a point. I can get about 95% of it put back together and then the problem manifests. I have thought that it might be a "bad" object on one of the layers, but I have ruled that out because: a.) All of the resulting sub-files (files that are portions of the larger one) exhibit no problems at all. Usually our PS printers find issues that Illy does not, and there is no problem RIPing the sub-files. b.) If I change the paste order, meaning copying & pasting from the top-most layers to the bottom-most, vice-versa, or in a completely random order, different objects (other than the 'Bruce Gray' type) get duplicated. I've had one of my park screens, a zip code type object and a school district boundary be the duplicated object.
All of these experiments have led me to believe that the Illustrator Type Engine is the main culprit. I just don't think it can handle that many individual point type objects. I know CS4 can handle the number of objects, based on the fact that a legacy-type (non-updated) file works.
I am almost entirely sure that Illustrator is working exactly as it is supposed to and that the vast majority of Illy users will never run into this issue. This file is by far the largest file that I work on. I would just like to be able to use an Intel native version of CS to continue maintaining this map.
On a side note: about three years ago, I tried working with this file in FreeHand MX. FreeHand initially would open the Illy file without a problem. I could work on it, but when I would save it as a FreeHand file, close it and re-open it, I would get your standard file corruption: it would partially open, give me a corruption dialog, and open the file as a blank document. I always knew there was a reason to use Illustrator over FreeHand for making maps.

Similar Messages

  • Memory consumption for an ASO app with a very large outline

    Solaris 10 running Essbase 9.3.1.3.08 64-bit
    We have an app with a very large outline (3 GB). When we do a dimension restructure, it takes about 30 minutes, which I suppose is reasonable. However, it takes up about 13 GB of RAM to do this and doesn't release the memory when finished unless we restart the app.
    Is this normal? It seems an incredible amount of RAM to use, even for such a large outline, and it doesn't release it when done.
    The box has 32Gb RAM.

    I think it was version 9.3.1.3 in which outline compaction started working. The first thing I would try is to compact the outline. The MaxL statement for it is:
        alter database xxxx.yyyy compact outline;
    ASO outlines tend to grow with every change. Do you really have that many members and attributes that would make an outline that big? If it keeps growing, at some point it will corrupt.

  • Help with a Very Large File, on a Large VDisk, On a Bad Sector

    I recently took over as the Sys Admin for a small office.  I found recently that the scheduled backups for one of our shared disks were failing, and we had no other backups of this data.  What I found was that NTBackup was failing when it accessed a couple of files on this share.
    I would like to run CHKDSK on this virtual disk, but prefer to have at least one good backup first.  I tried excluding the files in question in the BKS file, but it still references them in the log file and still fails.
    Regardless of how I end up getting my backup, this is the question I have:
    Is there a way to scan and recover these files w/o running CHKDSK on the entire volume?
    The system is Windows Server 2003 (soon to be upgraded!).  The VDisk is a 12 TB volume and the files in question are over 250 GB each.  This volume has many large files in the GB range.
    When it is working, a backup takes close to 3 days... I suspect a CHKDSK will be similar.
    Any Help is Appreciated,
    WKCook
    Keith

    Hi Keith,
    What type of files cannot be backed up? If they are system files, you could run sfc /scannow on the server to repair the corrupted Windows system files.
    Use the System File Checker tool to repair missing or corrupted system files
    https://support.microsoft.com/en-us/kb/929833
    Best Regards,
    Mandy

  • Rebuild iTunes with Just VERY LARGE iTunes Music Folder

    Hello,
    As much as I tried to back up every bit of my iTunes database regularly, I failed to assure the quality of these files. Well, I had a family member accidentally delete the iTunes Library and iTunes Library.xml files (yup, my mistake to allow them access to my Mac in the first place). It is only now that I realize the most recent backup copies of these files are corrupted. The really good news here is that I have multiple intact copies of my entire iTunes Music folder. My goal is to recreate my iTunes Library and iTunes Library.xml files to accommodate it.
    Now, here is the kicker. My database is 38,000 songs and 190 GB! I certainly know how to add songs to iTunes, but it seems as though it chokes on large projects like this and misses random parts of the tag information. So, what is the best way to get me out of this hole, with the least amount of trouble and the most efficient restorative actions? While I wait for an answer, I am also researching various other options and digging deep to rescue the deleted files.
    If we learn from our mistakes, then I obtained my PhD years ago,
    Dr. Z.

    Dr. Z,
    My iTunes library is about the same size. I am experiencing a slightly different problem, though. At times when I start iTunes up, I get a message stating that my library is corrupt, and it then tries to restore it from the xml file. It finishes, but doesn't get every song. I then have to select all songs and clear them while keeping the files intact on the hard drive. I then have to add my iTunes Music folder and all its contents using the iTunes "Add to Library" function. It gets all the songs, but of course all the dates when the songs were added to the library now have the current day's date. I have to repeat this whole scenario about three times a week. Grrrrr...
    It seems like iTunes isn't capable of handling large numbers of songs. I'm sure it should be able to, but it sure isn't at the moment.
    As to your problem, what if you try adding your songs a chunk at a time instead of all at once? Maybe that will keep your data intact. Just a thought. Might work. I've seen stranger things work.
    G5 2.5gHz PowerMac w/DVD-RW Drive, G3 400mHz Powerbook   Mac OS X (10.4.3)  

  • Very large file upload 2 GB with Adobe Flash

    Hello, does anyone know how I can upload very large files with Adobe Flash, or how to use SWFUpload?
    Thanks in advance. All help will be very appreciated.

    1. Yes.
    2. I'm getting an error message from PHP:
    if( $_FILES['Filedata']['error'] == 0 ){ 
      if( move_uploaded_file( $_FILES['Filedata']['tmp_name'], $uploads_dir.$_FILES['Filedata']['name'] ) ){ 
        echo 'ok'; 
        exit(); 
      } 
    } 
    echo 'error';    // Overhere - reached when the upload or the move fails 
    exit();
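    (A side note on very large uploads, as an assumption worth checking rather than a confirmed diagnosis here: PHP's own limits are a common failure point independent of Flash. If the file exceeds the upload_max_filesize directive in php.ini, $_FILES['Filedata']['error'] will be nonzero and the code above lands straight in the 'error' branch; if it exceeds post_max_size, $_FILES arrives empty; and max_input_time caps how long PHP will spend receiving the upload. Checking those three values against a 2 GB target is a cheap first step.)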

  • Best data structure for dealing with very large CSV files

    Hi, I'm writing an object that stores data from a very large CSV file. The idea being that you initialize the object with the CSV file, and then it has lots of methods to make manipulating and working with the CSV file simpler: operations like copy column, eliminate rows, perform some equation on all values in a certain column, etc. Also a method for printing back to a file.
    However, the CSV files will probably be in the 10 MB range, maybe larger, so simply loading them into an array isn't possible, as it produces an OutOfMemoryError.
    Does anyone have a data structure they could recommend that can store the large amounts of data required and is easily writable? I've currently been using a RandomAccessFile, but it is awkward to write to, as well as needing an external file which would need to be cleaned up after the object is removed (something very hard to guarantee occurs).
    Any suggestions would be greatly appreciated.

    How much internal storage ("RAM") is in the computer where your program should run? I think I have 640 MB in mine, and I can't believe loading 10 MB of data would be prohibitive, not even if the size doubles when the data comes into Java variables.
    If the data size turns out to be prohibitive of loading into memory, how about a relational database?
    Another thing you may want to consider is a more object-oriented (in the sense of domain-oriented) analysis and design. If the data is concerned with real-life things (persons, projects, monsters, whatever), row and column operations may be fine for now, but future requirements could easily make you prefer something else (for example, a requirement to sort projects by budget or monsters by proximity to the hero).
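    If loading into memory really does turn out to be prohibitive, the usual fallback is a streaming pass: read, transform and write one row at a time, so heap use stays constant regardless of file size. A minimal sketch of that approach; the file names and the doubled column are invented for illustration, and the naive split does not handle quoted fields:

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.FileWriter;
    import java.io.PrintWriter;

    public class CsvStreamTransform {
        public static void main(String[] args) throws Exception {
            // Stream input.csv row by row, double the value in column 2,
            // and write the result to output.csv. Only one row is ever
            // held in memory, so heap size does not limit file size.
            try (BufferedReader in = new BufferedReader(new FileReader("input.csv"));
                 PrintWriter out = new PrintWriter(new FileWriter("output.csv"))) {
                String line = in.readLine();              // pass the header row through untouched
                if (line != null) out.println(line);
                while ((line = in.readLine()) != null) {
                    String[] cols = line.split(",", -1);  // -1 keeps trailing empty fields
                    // assumes at least 3 columns and a numeric column 2
                    cols[2] = String.valueOf(Double.parseDouble(cols[2]) * 2);
                    out.println(String.join(",", cols));
                }
            }
        }
    }

    Column-copy or row-elimination operations fit the same loop shape; only sorting genuinely needs either the whole file in memory or an external merge sort.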

  • Issues opening very large files in Photoshop 12.1 (64-bit)

    Hello, I am having issues opening very large files I created, one being 1,241,776 KB. I have PS 12.1, the 64-bit version. I am not sure if the issues I am having are because of the PS version I have, and whether or not I have to upgrade.

    I think it's more likely a memory / scratch disk issue.  1.25 GB is a very big image file!!
    Nancy O.

  • Old mpg video from an early "smart phone" plays in iPhoto but can't be opened, edited or shared with other software

    I have an old mpg video file, taken with a very early model "smart phone", that opens and plays fine in my newly "rebuilt" iPhoto library. I cannot open it, edit it or share it with any other software. How can I fix it, to be able to do anything with it?
    Detail:
    I've recently purchased a 4 TB Thunderbolt drive to store my "vast" music and photo libraries. iPhoto had an issue reading the moved library, so I bought and used iPhoto Library Manager to "rebuild" it. Apart from losing much of the original data, such as date taken & camera used, it appears to be working well. The aforementioned mpg video was taken some 9 years ago, using an early model "iMate" smart phone, and opens and plays fine in iPhoto, but I cannot open it with anything else (I've tried iMovie, VLC players, Wondershare and Handbrake), nor can I share it. I just want to edit it, and share it with family.
    Any help would be appreciated...

    No - not iMovie, VLC, Wondershare or Handbrake... QuickTime starts with a "CONVERTING", then I get an error.
    I've looked at the "tell me more" links and tried downloading some of the movie players there. I'm beginning to think the file is corrupt.
    Thanks for getting back to me again though - appreciate it...

  • Slow Performance or XDP File size very large

    There have been a few reports of people having slow performance in their forms (typically Dynamic forms), or of XDP file sizes being very large.
    These are the symptoms of a problem with cut and paste in Designer, where a Processing Instruction (PI) used to control how Designer displays a specific palette is repeated many, many times. If you look in your XDP source and see this line repeated more than once, then you have the issue:
        <?templateDesigner StyleID aped3?>
    Until now, the workaround has been to apply a style sheet to the XDP to remove the repeated instruction. A patch has been released that will fix the cut and paste issue as well as repair your templates when you open them in a Designer with the patch applied.
    Here is a blog entry that describes the patch as well as where to get it.
    http://blogs.adobe.com/livecycle/2009/03/post.html

    My XDP file grew to 145 MB before I decided to see what was actually happening.
    It appears that the LiveCycle Designer ES program sometimes writes a lot of redundant data... the same line millions of times, over & over again.
    I wrote this small Java program, which reduced the size to 111 KB!!! (wow, what a bug that must have been!)
    Here's the source code:
    import java.io.BufferedReader;
    import java.io.BufferedWriter;
    import java.io.FileNotFoundException;
    import java.io.FileReader;
    import java.io.FileWriter;
    import java.io.IOException;
    public class MakeSmaller {
        private static final String DELETE_STRING = "                           <?templateDesigner StyleID aped3?>";
        public static void main(String... args) {
            BufferedReader br = null;
            BufferedWriter bw = null;
            try {
                br = new BufferedReader(new FileReader(args[0]));
                bw = new BufferedWriter(new FileWriter(args[0] + ".small"));
                String line = null;
                boolean firstOccurrence = true;
                while ((line = br.readLine()) != null) {
                    if (line.equals(DELETE_STRING)) {
                        // keep only the first copy of the duplicated processing instruction
                        if (firstOccurrence) {
                            bw.write(line + "\n");
                            firstOccurrence = false;
                        }
                    } else {
                        // all other lines are copied through unchanged
                        bw.write(line + "\n");
                    }
                }
            } catch (FileNotFoundException e) {
                e.printStackTrace();
            } catch (IOException e) {
                e.printStackTrace();
            } finally {
                if (br != null) {
                    try { br.close(); } catch (IOException e) { e.printStackTrace(); }
                }
                if (bw != null) {
                    try { bw.close(); } catch (IOException e) { e.printStackTrace(); }
                }
            }
        }
    }
    The file that gets generated sits in the same location as the XDP file but gets the extension .small. Just in case something goes wrong, the original file is NOT modified, as you can see in the source code. And yes, Designer REALLY wrote that line a gazillion times in the .xdp file (shame on the programmers!!)
    You can also see that I write the first occurrence to the small file, just in case it's needed...
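    (To try it: compile with javac MakeSmaller.java, then run java MakeSmaller form.xdp, substituting your own file name; it writes form.xdp.small next to the original and leaves the .xdp untouched.)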

  • Upload very large file to SharePoint Online

    Hi,
    I tried uploading very large files to SharePoint Online via the REST API using Java. Uploading files with a size of ~160 MB works well, but if the file size is between 400 MB and 1.6 GB, I either receive a SocketException (connection reset by peer) or an error from SharePoint telling me "the security validation for the page is invalid".
    So first of all I want to ask how to work around the security validation error. As far as I understand, the token which I have added to the X-RequestDigest header of my HTTP POST request seems to have expired (by default it expires after 1800 seconds).
    Uploading such large files is time-consuming, so I can't change the token in my header while uploading the file. How can I extend the expiration time of the token (if that is possible)? Could it help to send POST requests with the same token continuously while uploading the file, and thereby prevent the token from expiring? Is there any other strategy for uploading such large files, which need much more than 1800 seconds?
    Additionally, any thoughts on the socket exception? It happens quite sporadically: sometimes after uploading 100 MB, and other times after I have already uploaded 1 GB.
    Thanks in advance!

    Hi,
    Thanks for the reply. The reason I'm looking into this is so users can migrate their files to SharePoint Online/Office 365. The max file limit is 2 GB, so I thought I would try to cover this.
    I've looked into that link before, and when I try to use those endpoints, i.e. StartUpload, I get the following response:
    {"error":{"code":"-1, System.NotImplementedException","message":{"lang":"en-US","value":"The
    method or operation is not implemented."}}}
    I can only presume that these endpoints have not been implemented yet, even though the documentation says they are only available for Office 365, which is what I am using.
    Also, the other strange thing is that I can actually upload a file of 512 MB to the server, as I can see it in the web UI. The problem I am experiencing is getting a response; it hangs on this line until the timeout is reached.
    WebResponse wresp = wreq.GetResponse();
    I've tried flushing and closing the stream, and SendChunked, but still no success.
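    (A sketch of one way around the 1800-second digest lifetime, under the assumption that re-requesting the digest is acceptable in your setup: the X-RequestDigest value can be fetched fresh at any time by POSTing to the site's /_api/contextinfo endpoint, so a chunked upload can refresh it per chunk or on a timer rather than holding one token for the whole transfer. The authCookie parameter and the string-scraping of the JSON response below are placeholders for illustration; a real client would reuse its existing authentication and a proper JSON parser.)

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class DigestRefresher {
        // Fetch a fresh form digest for the X-RequestDigest header.
        // siteUrl and authCookie are illustrative; plug in your own auth.
        public static String fetchFormDigest(String siteUrl, String authCookie) throws Exception {
            URL url = new URL(siteUrl + "/_api/contextinfo");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Accept", "application/json;odata=verbose");
            conn.setRequestProperty("Cookie", authCookie);
            conn.setDoOutput(true);
            conn.getOutputStream().close();   // send an empty POST body

            StringBuilder body = new StringBuilder();
            try (BufferedReader r = new BufferedReader(
                    new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
                String line;
                while ((line = r.readLine()) != null) body.append(line);
            }
            // Crude extraction of FormDigestValue; use a JSON parser in real code.
            String marker = "\"FormDigestValue\":\"";
            int idx = body.indexOf(marker);
            if (idx < 0) throw new IllegalStateException("FormDigestValue not found");
            int start = idx + marker.length();
            return body.substring(start, body.indexOf("\"", start));
        }
    }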

  • Photoshop CS2 file corruption with Leopard 10.5.3

    In addition to the network server corruption issue being experienced by everyone with CS3 and OS X 10.5.3, since Wednesday my CS2 version 9 Photoshop files are turned into corrupted, flattened Mexican rug files when I save to my own G5 pre-Intel hard drive. This is specifically happening when I save files to my hard drive and then shut down the computer minutes later. When I restart minutes later, the files saved just prior to shutdown are flattened and corrupted into a very colorful, horizontally banded Mexican rug. Which would be nice if I was designing Mexican rugs, but I'm not. Files saved during the day with no shutdown appear to be okay.
    This has happened twice since Wednesday.
    I also noticed that since the Leopard 10.5.3 update, things hang and won't shut down properly when I shut down. I can't figure out if this is Photoshop, iTunes (it wouldn't force quit), OS X, or a combination, so I have to cut the power. Both times when I've done this, it resulted in corrupted files, even though their saves were completed prior to shutdown.
    Does anyone know if this is an Apple thing or an Adobe thing? I wonder if this is one of the "issues for which there is no resolution" that Adobe says can happen using CS2 on Leopard.
    Anybody have any solutions to this, other than going back to 10.5.2? I've been backing up multiple copies of files at different times of the day in different places. I don't know what else to do.
    I have 2.5 GB of RAM, 96 GB of free disk space, NVIDIA GeForce 6800 Ultra DDL.

    Wendell,
    If your system is hanging upon shutdown, and files being saved immediately before shutdown are corrupted, then I'd say that the save process is not complete when you are shutting down. Otherwise it makes no sense, as your system does not know that you are going to shut down at that point.
    Have you done system maintenance? And immediately before you upgraded to Mac OS X v10.5.x did you boot off the System disc and repair permissions and your drive? And did you boot off Alsoft's DiskWarrior and do the same? And then repeat the permissions repair immediately after the upgrade? Did you reinstall your apps from their original discs? If running maintenance now does not repair the issues AND if your system was behaving properly before, you may want to do an archive and restore to Mac OS X v10.4.11 until Apple has a further fix for Mac OS X v10.5.x.
    Note that many of us who have pre-Intel machines that do not require Mac OS X v10.5.x have not upgraded -- Mac OS X v10.4.x is very stable. It is compatible with all CS3 apps (except as noted on the box by Adobe).
    ===
    >Don't fob your customers off. This is HUGE BUG - I've paid for my software so fix it!
    Carla,
    I understand your frustration -- you are not alone. But, please post your questions or concerns just once. I've deleted your duplicate post.
    Neil

  • Photoshop CS2 file corruption with Leopard 10.5.3 update

    In addition to the network server corruption issue being experienced by everyone with CS3 and OS X 10.5.3, since Wednesday (and the 10.5.3 update) my CS2 version 9 Photoshop files are turned into corrupted, flattened Mexican rug files when I save to my own G5 pre-Intel hard drive. This is specifically happening when I save files to my hard drive and then shut down the computer minutes later. When I restart minutes later, the files saved just prior to shutdown are flattened and corrupted into a very colorful, horizontally banded Mexican rug. Which would be nice if I was designing Mexican rugs, but I'm not. Files saved during the day with no shutdown appear to be okay.
    This has happened twice since Wednesday.
    Please note that my Photoshop "save" is completed BEFORE I shut down, by several minutes.
    Also, in both cases, an application was hanging and wouldn't force quit, which initiated my shutting down. First it was iTunes, then it was Mail. So I can't figure out if this is Photoshop, iTunes (it wouldn't force quit once), OS X, or a combination. But the only thing I can do is shut down the power.
    Does anyone know if this is an Apple thing or an Adobe thing? I wonder if this is one of the "issues for which there is no resolution" that Adobe says can happen using CS2 on Leopard. Or part of the Leopard update. Or both cooperating together.
    Anybody have any solutions to this, other than going back to 10.5.2 (or Tiger) ? I've been backing up multiple copies of files at different times of the day in different places. I don't know what else to do. Can't work much longer with corruption over my head. Leopard has been one long big headache.
    I have 2.5 GB of RAM, 96 GB of free disk space, NVIDIA GeForce 6800 Ultra DDL.

    The problem, as I see it, is "which comes first - the chicken or the egg?" To explain: I am currently operating on an iMac G5, running OS X 10.3.9, and I have Adobe Creative Suite 2 Premium installed on my computer. Okay, now my question... or rather questions (plural), are: #1 If I upgrade my operating system to OS X 10.4 (Tiger), is Adobe CS2 (Photoshop, Illustrator, InDesign, etc.) compatible with OS X 10.4?
    * Okay, let's suppose the answer to #1 is "yes - it's compatible... CS2 will run fine on OS X 10.4".
    #2 Next question - will Adobe CS2 run (without any problems) if I were to upgrade to OS X 10.5 (Leopard)? I believe, from what I have been reading, the answer to this is "No, Adobe CS2 is not compatible with Mac OS X 10.5". Okay, let's say that I would then have to plunk down the bucks and upgrade to Adobe CS3. On to the next and final question...
    #3 Okay, if I were to upgrade my OS to OS X 10.5, what do I upgrade first??? The operating system (which is not compatible with Adobe CS2), or the Adobe CS3 suite (which is not compatible with my present OS, 10.3.9)??? I have found nothing at either the Adobe or Apple web sites which addresses this specific issue. I guess it's just like people - when your systems and software become outdated, the **** with you.
    Like I said; Which comes first - the chicken or the egg?
    PS: from what I have read on this thread, it appears that people really don't read what someone is saying... you (LoobyDooby) were talking about CS2 and suddenly everyone else started talking about CS3. It's like tech support... you ask about "apples" and they tell you that "oranges" are not in season... huh???

  • Setting resolution, deciding file type, for very LARGE Canvas prints. 36MP camera.

    Okay, so I noticed my Lightroom was set to a 240 PPI resolution. I changed it to 300 because I read that 300 was standard for prints. What would I need to enter in the resolution box on export for a very large canvas?
    Is it better to choose TIFF instead of JPEG for prints of this quality? If not, what should I choose?
    I am using a Sony A7R full frame 36.4 MP, and with some of the sharp Zeiss lenses there is really no noticeable pixelation when I zoom in 100 percent. Of course, the A7R is said to have one of the best sensors on the market today. It's supposed to be like the Nikon D800E, but apparently it has some advantages.
    In other words, I want to export in as high of quality as possible for the Canvas. File size is not an issue.

    Changing the resolution setting does absolutely nothing to the digital image. This is a common misconception. The only thing that counts in a digital image is its pixel dimensions. Regardless of what the PPI setting is (240, 300, 600, whatever), the image still has the same number of pixels. To determine what you need for a print of any size, multiply the inches by the desired pixels per inch. Suppose you want a 16 x 20" print at 300 pixels per inch. The math would be:
    300x16 = 4800 pixels
    300x20 = 6000 pixels
    So to print a 16 x 20" print you would need an image that is 4800 x 6000 pixels. And the PPI setting can be anything you want it to be because it has no effect on the image.

  • I need to sort very large Excel files and perform other operations.  How much faster would this be on a MacPro rather than my MacBook Pro i7, 2.6, 15R?

    I am a scientist and run my own business.  Money is tight.  I have some very large Excel files (~200 MB) that I need to sort and perform logic operations on.  I currently use a MacBook Pro (i7 core, 2.6 GHz, 16 GB 1600 MHz DDR3) and I am thinking about buying a multicore Mac Pro.  Some of the operations take half an hour to perform.  How much faster should I expect these operations to be on a new Mac Pro?  Is there a significant speed advantage in the 6-core vs the 4-core?  Practically speaking, what are the features I should look at, and what is the speed bump I should expect if I go to 32 GB or 64 GB?  Related to this, I am using a 32-bit version of Excel.  Is there a 64-bit spreadsheet that I can use on a Mac that has no limit on column and row size?

    Grant Bennet-Alder,
    It’s funny you mentioned using Activity Monitor.  I use it all the time to watch when a computation cycle is finished so I can avoid a crash.  I keep it up in the corner of my screen while I respond to email or work on a grant.  Typically the %CPU will hang at ~100% (sometimes even saying the application is not responding in red) but will almost always complete the cycle if I let it go for 30 minutes or so.  As long as I leave Excel alone while it is working it will not crash.  I had not thought of using the Activity Monitor as you suggested. Also I did not realize using a 32 bit application limited me to 4GB of memory for each application.  That is clearly a problem for this kind of work.  Is there any work around for this?   It seems like a 64-bit spreadsheet would help.  I would love to use the new 64 bit Numbers but the current version limits the number of rows and columns.  I tried it out on my MacBook Pro but my files don’t fit.
    The hatter,
    This may be the solution for me. I’m OK with assembling the unit you described (I’ve even etched my own boards) but feel very bad about needing to step away from Apple products.  When I started computing this was the sort of thing computers were designed to do.  Is there any native 64-bit spreadsheet that allows unlimited rows/columns, which will run on an Apple?  Excel is only 64-bit on their machines.
    Many thanks to both of you for your quick and on point answers!

  • Unable to copy very large file to eSATA external HDD

    I am trying to copy a VMWare Fusion virtual machine, 57 GB, from my Macbook Pro's laptop hard drive to an external, eSATA hard drive, which is attached through an ExpressPort adapter. VMWare Fusion is not running and the external drive has lots of room. The disk utility finds no problems with either drive. I have excluded both the external disk and the folder on my laptop hard drive that contains my virtual machine from my Time Machihne backups. At about the 42 GB mark, an error message appears:
    The Finder cannot complete the operation because some data in "Windows1-Snapshot6.vmem" could not be read or written. (Error code -36)
    After I press OK to dismiss the dialog, the copy does not continue, and I cannot cancel it. I have to force-quit the Finder to make the copy dialog go away before I can attempt the copy again. I've tried rebooting between attempts; still no luck. I have tried a total of 4 times now, with the exact same result at the exact same place: 42 GB of 57 GB.
    Any ideas?

    Still no breakthrough from Apple. They're telling me to terminate the VMWare processes before attempting the copy, but had they actually read my description of the problem first, they would have known that I already tried this. Hopefully they'll continue to investigate.
    From a correspondence with Tim, a support representative at Apple:
    Hi Tim,
    Thank you for getting back to me, I got your message. Although it is true that at the time I ran the Capture Data program there were some VMWare-related processes running (PID's 105, 106, 107 and 108), this was not the case when the issue occurred earlier. After initially experiencing the problem, this possibility had occurred to me so I took the time to terminate all VMWare processes using the activity monitor before again attempting to copy the files, including the processes mentioned by your engineering department. I documented this in my posting to apple's forum as follows: (quote is from my post of Feb 19, 2008, 1:28pm, to the thread "Unable to copy very large file to eSATA external HDD", relevant section in >bold print<)
    Thanks for the suggestions. I have since tried this operation with 3 different drives through two different interface types. Two of the drives are identical - 3.5" 7200 RPM 1TB Western Digital WD10EACS (WD Caviar SE16) in external hard drive enclosures, and the other is a smaller USB2 100GB Western Digital WD1200U0170-001 external drive. I tried the two 1TB drives through eSATA - ExpressPort and also over USB2. I have tried the 100GB drive only over USB2 since that is the only interface on the drive. In all cases the result is the same. All 3 drives are formatted Mac OS Extended (Journaled).
    I know the files work on my laptop's hard drive. They are a VMWare virtual machine that works just fine when I use it every day. >Before attempting the copy, I shut down VMWare and terminated all VMWare processes using the Activity Monitor for good measure.< I have tried the copy operation both through the finder and through the Unix command prompt using the drive's mount point of /Volumes/jfinney-ext-3.
    Any more ideas?
    Furthermore, to prove that there were no file locks present on the affected files, I moved them to a different location on my laptop's HDD and renamed them, which would not have been possible if there had been interference from vmware-related processes. So, that's not it.
    Your suggested workaround, to compress the files before copying them to the external drive, may serve as a temporary workaround but it is not a solution. This VM will grow over time to the point where even the compressed version is larger than the 42GB maximum, and compressing and uncompressing the files will take me a lot of time for files of this size. Could you please continue to pursue this issue and identify the underlying cause?
    Thank you,
    - Jeremy
