Large TIF file errors/erratic performance

Sorry for the redundancy, but I can't seem to find any clear answer on what the maximum file size (in either dimensions or megabytes) is before Aperture fails entirely. I'm scanning medium format on a Nikon LS-9000, which results in ~8500x10500 TIF files of ~180MB (16-bit greyscale). Aperture will import the file, but then typically crashes a number of times immediately after import while generating the thumbnail/preview. I'll then generally see the thumbnail go black after making any adjustments to the image, and it will stay that way. Lightroom handles these files quite handily; I like Aperture, and migrating my sizeable library would be onerous at a minimum. What's even more frustrating is that iPhoto '09 will readily open the same image file that seems to be choking Aperture to death.
Thanks for any advice,
Matt.

Yep,
There is no "answer" for this but I can validate your experience. My medium format scans are not viable in Aperture at 4000DPI 16bit. Way to painful when you can coax it to work and it doesn't work for very long. In other words Aperture uses so many resources trying to deal with this size file that it is either idiotically slow or then breaks.
The good news is that CS4 processes 500+ meg Tiffs way better than any other Photoshop ever did, even in 32bit mode - I cannot wait for the 64bit upgrade from Adobe when Snow Leopard comes out. I would imagine that Aperture will get the 64bit treatment sooner or later and all will be fine with massive files.
RB
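
A possible stopgap while waiting for a 64-bit Aperture is to pre-shrink the scan before import. This is only a rough sketch, assuming Pillow and NumPy are installed; the file names are placeholders, and dropping to 8-bit costs editing headroom, so keep the 16-bit original archived outside the library.

import numpy as np
from PIL import Image

# Placeholder file names; adjust to the actual scans.
src = Image.open("scan_16bit_grey.tif")        # 16-bit greyscale opens as mode "I;16"
arr = np.asarray(src, dtype=np.uint16)

arr8 = (arr >> 8).astype(np.uint8)             # 16-bit -> 8-bit (divide by 256)
out = Image.fromarray(arr8, mode="L")

# Optional: halve the pixel dimensions as well.
out = out.resize((out.width // 2, out.height // 2), Image.LANCZOS)

out.save("scan_8bit_grey.tif", compression="tiff_lzw")
print(out.size, "written")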

Similar Messages

  • Problem opening large .Tif files

    Hi... this problem is really frustrating. Preview will not open some large .tif files (they're saved from Photoshop CS5 on a PC, with LZW compression). The program just displays a grey screen... no error message. When I email the images, no preview appears in Mail. Is this a known problem with Preview in Snow Leopard, and can I do anything about it?
    Any help appreciated....

    OK, I can verify that the file doesn't display in Safari or Preview, so it's not just you. From the error messages in the log, it seems that the built-in rendering routines see the file as corrupt. It does display correctly in QuickTime Player 7, GraphicConverter, OpenOffice.org, and LightZone. You'll either have to use another application to view the files, or get the author to save them in a different format. You can also file bug reports with Adobe and Apple, for all the good that will do.
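
    If re-saving is an option, a minimal sketch (assuming Pillow is installed; the file names are placeholders) that strips the LZW compression so Preview/Quick Look have a plainer file to read:

    from PIL import Image

    # Pillow reads LZW-compressed TIFFs fine; re-save uncompressed.
    img = Image.open("from_photoshop_lzw.tif")   # placeholder name
    img.save("preview_friendly.tif", compression="raw")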

  • Problems viewing large tif files in preview

    I'm having trouble viewing large TIF files in Preview. The TIF files are on a CD-ROM - when I click on a file to open it, it opens only the first 250 pages. The file has approximately 3,000 pages. How can I get Preview to open the rest of the pages?
    thanks for any suggestions.
    mac mini   Mac OS X (10.4.6)  

    No trick - I didn't create the CD-ROM, but it only has 3 TIF files with approximately 3,000 pages each, not 3,000 large TIF files. Plus several smaller PDF files, but those aren't giving me any problems.
    I don't know whether they're compressed, but I still can't get more than the first 250 pages to open, even after copying the file to my desktop. If anyone has any other ideas, I'd much appreciate it.
    mac mini   Mac OS X (10.4.6)  
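
    If the goal is simply to get at the pages past 250, here is a rough sketch with Pillow (placeholder file names, assuming the TIFs copy cleanly off the CD-ROM) that splits a range of pages out into single-page files you can open anywhere:

    from PIL import Image

    doc = Image.open("big_scan.tif")              # placeholder name
    print("pages:", doc.n_frames)                 # total page count

    for page in range(250, min(doc.n_frames, 300)):   # e.g. pages 251-300
        doc.seek(page)
        doc.save(f"page_{page + 1:04d}.tif")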

  • Large .bpel file size vs performance

    How does a large .bpel file size affect performance? Say I have a process of about 0.9 MB with around 10,000 lines - how does this affect instance creation, fetching, and message creation during the process life cycle?
    Edited by: arababah on Mar 8, 2010 7:23 AM

    Johnk93 wrote:
    MacDLS,
    I recently did a little house-cleaning on my startup drive (only 60Gb) and now have about 20Gb free, so I don't think that is the problem.
    It's probably not a very fast drive in the first place...
    I know that 5MB isn't very big, but for some reason it takes a lot longer to open these scanned files in Photoshop (from Aperture) than the 5MB files from my camera. Any idea why this is?
    Have a look at the file size of one of those externally edited files for a clue - it won't be 5MB. When Aperture sends a file out for editing, it creates either a PSD or an uncompressed TIFF after applying any image adjustments that you've applied in Aperture, and sends that out. Depending on the settings in Aperture's preferences this will be in either 8-bit or 16-bit.
    As a 16-bit uncompressed TIFF, a 44 megapixel image weighs in at a touch over 150MB...
    Ian
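
    A quick back-of-the-envelope check of why the externally edited files balloon (illustrative dimensions only): an uncompressed file is roughly pixels x channels x bytes per channel.

    def tiff_size_mb(width, height, channels, bits):
        # Ignores headers/metadata; good enough for a sanity check.
        return width * height * channels * (bits // 8) / 1024 / 1024

    print(tiff_size_mb(8500, 10500, 1, 16))   # ~170 MB: 16-bit greyscale medium-format scan
    print(tiff_size_mb(5616, 3744, 3, 16))    # ~120 MB: a 21 MP camera file as 16-bit RGB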

  • Large PDF file sizes when exporting from InDesign

    Hi,
    I was wondering if anyone knew why some PDF file sizes are so large when exporting from ID.
    I create black and white user manuals with ID CS3. We post these online, so I try to get the file size down as much as possible.
    There is only one .psd image in each manual. The content does not have any photographs, just Illustrator .eps diagrams and line drawings. I am trying to figure out why some PDF file sizes are so large.
    Also, why the file sizes are so different.
    For example, I have one ID document that is 3MB.
    Exporting it at the smallest file size, the PDF file comes out at 2MB.
    Then I have another ID document that is 10MB.
    Exporting to PDF is 2MB (the same size as the smaller ID document)... this one has many more .eps's in it and a lot more pages.
    Then I have another one where the ID file is 8MB and the PDF is 6MB. Why is this one so much larger than the one from the 10MB ID document?
    Any ideas on why this is happening and/or how I can reduce the file size?
    I've tried adjusting the export compression and other settings but that didn't work.
    I also tried to reduce them after the fact in Acrobat to see what would happen, but it doesn't reduce it all that much.
    Thanks for any help,
    Cathy

    > Though, the .eps's are only about 100K to 200K in size, and they are linked, not embedded.
    But they're embedded in the PDF.
    > It's just strange though, because our marketing department has an 80-page full-color catalog that, when exported, is only 5MB. Their ID document uses many very large .tif files. So I am leaning toward it being an .eps/.ai issue??
    "Issue" implies there's something wrong, but I think this is just the way it's supposed to work.
    Line drawings, while usually fairly compact, cannot be lossy-compressed. The marketing department, though, may compress their very large TIFF files as much as they like (with a corresponding loss of quality). It's entirely possible to compress bitmaps to a smaller size than the drawings those bitmaps were made from.
    You could test this yourself: just open a few of your EPS drawings in Photoshop, save as TIFF, place in ID, and try various downsampling schemes. If you downsample enough, you'll get the PDF smaller than a PDF that uses the same graphics as line-drawing EPS files. But you may have to downsample them beyond recognition...
    Kenneth Benson
    Pegasus Type, Inc.
    www.pegtype.com
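
    For a rough preview of Kenneth's downsampling test outside of Photoshop/InDesign, here is a sketch assuming Pillow; "drawing.tif" stands in for an EPS you have already rasterised and saved as TIFF:

    import os
    from PIL import Image

    src = Image.open("drawing.tif")               # placeholder name
    for factor in (1, 2, 4):
        small = src.resize((src.width // factor, src.height // factor), Image.LANCZOS)
        name = f"drawing_down{factor}x.tif"
        small.save(name, compression="tiff_adobe_deflate")   # lossless ZIP-style compression
        print(name, round(os.path.getsize(name) / 1024), "KB")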

  • Freezing Finder with PSD/TIF files - HELP!

    Hello everyone,
    First of all, I am pretty much ******. I've invested over 2400€ in my brand-new iMac and now, out of nowhere, the Finder in Snow Leopard OS X 10.6.4 starts to hang/freeze whenever I go into a folder with large PSD/TIF files.
    It started with large TIF files. I placed them along with other PSD projects, and they range from about 50-300MB each. These files are in a specific project folder. Now I go into the folder, and it doesn't matter how I view it, list or icon view: I scroll down, bam! Spinning wheel of death, and no, I am not able to force quit the Finder.
    I am like, what the ****..?! I just switched over to the Mac. I just felt the love of the Mac slogan: "It just works". Yeah, I can feel it..
    I installed the Xee image viewer before and deleted it again - no change. I even deleted Norton Anti-Virus - no change, the problem still exists. I've been up till around 5AM with no sleep, trying to figure out what has caused this problem. I think I'm losing it. I googled like mad, but there were no significant solutions.
    HOWEVER, if I start up Path Finder and go to the projects folder, I can seamlessly view all the PSD/TIF project files and use Quick Look. It won't hang OS X. This is my current workaround... but I don't want to use Path Finder all the time just to get to my projects folder.
    I really need your help. (x_x)
    Thanks

    Wow, that was a fast response, thank you.
    I did remove Norton after the incident, and yet the problem still exists.
    I used to open my project files with Photoshop CS5. Before I open anything, I browse through my project data within the Finder, i.e. list view + Quick Look; I just scroll up and down within the window to check the file names before I open anything.
    Everything used to work just fine. I used the Xee image viewer and could literally open and view any image. But the larger the files get, and the more files I have in one folder - like 100+ - the more likely it is that my Finder hangs. And it still does.
    It starts to freeze within seconds, or when I scroll down further - there it goes, the spinning wheel goes nuts and I have to restart my iMac (omg). I've done that like 8-10 times now. I left my Mac on when I went to bed for 2 hours, and when I checked it again, the spinning wheel had done nothing but spin and I could not quit the Finder. Everything hangs.
    All I need to do is open my Projects folder, and boom, freeze. I thought it was due to the large TIF files. I deleted them via Adobe Bridge, because I can't go to that folder with the Finder, otherwise it'll freeze AGAIN. Now I had the PSD files left; they are about 50-300MB each, and I have around 100 of them in one folder. The issue still remains.
    I deleted almost everything, checked my startup items and so forth, but I did not have the problem before, or I didn't notice it.
    I can, however, browse lots of smaller PSD files, AND, as I mentioned in the post above, I can view my project folder with PATH FINDER and ADOBE BRIDGE with absolutely NO problem.
    I guess it's a Finder/Preview/PSD-extension problem? I've had the Mac for about a month and now I get this. At least no one can tell me I got these issues from some weird trojans or a "badly configured" computer.

  • Deploy large WAR file (131M) error

    hi,
    I want to deploy a WAR (131M) application to Oracle 10g Application Server (9.0.4). When I deploy to the server using this command: dcmctl deployapplication -f ../myapp.war -a my aaa -rc /myapp, the error below occurs:
    ADMN-705003
    Evaluation phase failed. This may be caused by an exception from the adapter used during evaluation.
    Base exception:
    java.lang.OutOfMemoryError: null
    Please contact Oracle Support.
    java.lang.OutOfMemoryError
    at com.evermind.server.rmi.RMIConnection.EXCEPTION_ORIGINATES_FROM_THE_REMOTE_SERVER(RMIConnection.java:1527)
    at com.evermind.server.rmi.RMIConnection.invokeMethod(RMIConnection.java:1480)
    at com.evermind.server.rmi.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:55)
    at com.evermind.server.rmi.RecoverableRemoteInvocationHandler.invoke(RecoverableRemoteInvocationHandler.java:22)
    at __Proxy0.deploy(Unknown Source)
    at oracle.ias.sysmgmt.deployment.j2ee.runtime.LocalDeploy.deployOnSingleInstance(Unknown Source)
    at oracle.ias.sysmgmt.deployment.j2ee.runtime.LocalDeploy.doExecute(Unknown Source)
    at oracle.ias.sysmgmt.deployment.j2ee.runtime.RuntimeIf.execute(Unknown Source)
    at oracle.ias.sysmgmt.deployment.j2ee.adapter.DeploymentAdapter.doEvaluateDeploy(Unknown Source)
    at oracle.ias.sysmgmt.deployment.j2ee.adapter.DeploymentAdapter.evaluate(Unknown Source)
    at oracle.ias.sysmgmt.task.TaskMaster.sync_evaluate(Unknown Source)
    at oracle.ias.sysmgmt.task.TaskMaster.internal_evaluate(Unknown Source)
    at oracle.ias.sysmgmt.task.RemoteEvaluate.execCommand(Unknown Source)
    at oracle.ias.sysmgmt.task.DaemonWorker.run(Unknown Source)
    I have modified dcmctl as
    if %found%==true %ex% -jar -Xms256M -Xmx1024M %jarpath% %cmdString%
    if %found%==false %ex% -jar -Xms256M -Xmx1024M %jarpath% %cmdString% -o %ohome%
    these are the last two lines in the dcmctl.bat file.
    I searched on Metalink and found this document:
    Subject: On Deploying an EAR file You Receive Errors: ADM-705003 and/or Memory Errors
    Doc ID: Note:290662.1 Type: PROBLEM
    Last Revision Date: 26-NOV-2004 Status: MODERATED
    The information in this document applies to:
    Web Services - Version: 9.0.4
    Oracle Containers for J2EE - Version: 9.0.0.0 to 10.10.10
    This problem can occur on any platform.
    j2ee deployments (WAR, EAR files) using dcmctl or dcmservlet
    Symptoms
    On deploying an EAR file, get this error on the server:
    ADMN-705003 and / or this error on server or client:
    java.lang.OutOfMemoryError
    Cause
    Too large EAR file
    Fix
    -- use compression in creating the archive
    NOTE: in JDeveloper, the default is not to compress. See the checkbox under the Options page of the deployment profile's Properties
    (right-click the deployment profile icon, select Properties, then find the Options page)
    - if the error is from the client, increase the JDeveloper deployment-client heap size:
    (right-click the deployment profile icon, select Properties, then find the Options page)
    - if the error is from the server, increase the iAS deployment-server heap size:
    In EM Website for the target OC4J node, select Server Properties and Java Options
    and set, e.g., -Xmx512M for 512MB
    References
    Note 223063.1 - Installing 9iAS Fails While Deploying OC4J Applications With No space left on device Error
    But when I add anything to the Java Options, the OC4J home instance can't start up. What can I do?
    Thank you very much,
    lixinzhu

    Hi Li,
    You can have two parallel installations of Application Server, which I would strongly recommend. One instance of 10.1.2.0.2 (upgraded to 10.1.2.2) for Forms and Reports (and maybe Portal, Discoverer etc too), and one instance of 10.1.3.2 for your J2EE code.
    If you have multiple servers, then you can cluster those OC4Js, which is great. If you have two servers, you could also dedicate one to 10.1.2 and one to 10.1.3 for better overall performance. However, there's no problem having both 10.1.2 and 10.1.3 on the same server.
    The 10.1.3 server can use the Web Cache (as well as Apache) from the 10.1.2 server.
    Regards,
    Martin

  • Signal Express Large TDMS File Recording Error

    Hello,
    I have the following application and I am looking for some tips on the best way to approach the problem with Signal Express:
    I am attempting to use Signal Express 2009 (Sound and Vibration Assistant) to collect random vibration data on three channels over an extended period of time - about 20 hours total. My sample rate is 2kHz. Sampling at that rate over that period of time involves creating a very large TDMS file, which is intended for various types of analysis later, in Signal Express or some other application. One of the analysis functions to be done is a PSD (Power Spectral Density) plot to determine the vibration levels distributed over a band of frequencies during the log.
    My original solution was to collect a single large TDMS file. I did this with the Signal Express recording options configured to save and restart "in current log" after 1 hour's worth of data is collected. I configured it this way because, if there is a crash or sudden loss of power during data collection, I wanted to ensure that at most an hour's worth of data would be lost. I tested this option, and the integrity of the file after a crash, by killing the SignalExpress process in the middle of recording the large TDMS file (after a few save-log-file conditions had been met). Unfortunately, when I restart Signal Express and try to load the log file data in playback mode, an error indicating "TDMS Data Corrupt" (or similar) is displayed. My TDMS file is large, so it obviously contains some data; however, Signal Express does not index its time and I cannot view the data within the file. The .tdms_index file is also present, but the meta data.txt file is not generated. Is there any way to ensure that I will have at least partially valid data that can be processed from a single TDMS file in the event of a crash during logging? I don't have much experience dealing with random vibration data, so are there any tips for generating vibration-level PSD curves for large files over such a long time span?
    My solution to this problem thus far has been to log the data to separate .TDMS files, about an hour in length each. This should result in about 20 files in my final application. Since I want to take a PSD, which ends up being a statistical average over the whole time period, I plan on generating a curve for each of these files and averaging all 20 of them together to get the overall vibration PSD curve for the 20-hour time period.

    JMat,
    Based on the description of your application, I would recommend writing the data to a "new log" every hour (or more often). Based on some of my testing, if you use "current log" and S&V Assistant crashes, the entire TDMS file will be corrupted. This seems consistent with what you're seeing.
    It would be good if you could clarify why you're hoping to use "current log" instead of "new log". I'll assume an answer so I can provide a few more details in this response. I assume it's because you want to be able to perform the PSD over the entire logged file (all 20 hours). And the easiest way to do that is if all 20 hours are recorded in a continuous file. If this is the case, then we can still help you accomplish the desired outcome, but also ensure that you don't lose data if the system crashes at some point during the monitoring.
    If you use "new log" for your logging configuration, you'll end up having 20 TDMS files when the run is complete. If the system crashes, any files that are already done writing will not be corrupted (I tested this). All you need to do is concatenate the files to make a single one. If this would work for you, we can talk about various solutions we can provide to accomplish this task. Let me know.
    Now there is one thing I want to bring to your attention about logging multiple files from SignalExpress, whether you use "current log" or "new log". The Windows OS is not deterministic, meaning it cannot guarantee how long an operation takes to complete. For your particular application, this basically means that between log files there will be a short gap in time during which data is not being saved to disk. Based on my testing, this gap could be between 1-3 seconds, and it depends heavily on how many other applications Windows is running at the same time.
    So when you concatenate the signals, you can choose to concatenate them "absolutely", meaning there will be a 1-3 second gap between the different waveforms recorded. Or you can concatenate them to assume there is no time gap between logs, resulting in a pseudo-continuous waveform (it looks continuous to you and the analysis routine).
    If neither of these options are suitable, let me know.
    Thanks, Jared 
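
    For the stitching/averaging step Jared describes, here is a sketch assuming the npTDMS and SciPy Python packages; the group/channel names and nperseg value are placeholders. It reads each hourly file and averages the hourly PSDs into one curve for the full run:

    import glob
    import numpy as np
    from nptdms import TdmsFile
    from scipy.signal import welch

    fs = 2000.0                       # sample rate from the original post
    psds = []

    for path in sorted(glob.glob("vibration_log_*.tdms")):
        tdms = TdmsFile.read(path)
        data = tdms["Vibration"]["Channel 1"][:]   # assumed group/channel names
        f, pxx = welch(data, fs=fs, nperseg=8192)  # PSD of this hour-long log
        psds.append(pxx)

    avg_psd = np.mean(psds, axis=0)   # average the hourly PSDs over the full run
    print(f.shape, avg_psd.shape)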

  • Out of Memory Error and large video files

    I created a simple page that links to a few large video files (2.5 GB total size). On Preview or FTP upload, Muse crashes and gives an Out of Memory error. How should we handle very large files like this?

    Upload the files to your host using an FTP client (e.g. FileZilla) and hyperlink to them from within your Muse site.
    Muse is currently not designed to upload files this large. The upload functionality takes the simple approach of reading an entire linked file into RAM and then uploading it. Given that Muse is currently a 32-bit application, it's limited to using 2GB of RAM (or less) at any given time, regardless of how much RAM you have physically installed. We should add a check to the "Link to File..." feature so it rejects files larger than a few hundred megs and puts up an explanatory alert. (We're also hard at work on the move to being a 64-bit app, but that's not a small change.)
    In general, your site visitors will have a much better experience viewing such videos if you upload them to a service like YouTube or Vimeo rather than hosting them yourself. Video hosting services provide a huge amount of optimization of video delivery that's not present for a file hosted on standard hosting: automatic resizing of the video to the appropriate resolution for the visitor's device (rather than potentially downloading a huge amount of unneeded data), transcoding to the video format required by the visitor's browser (rather than either having to do so yourself or having some visitors unable to view your video), automatic distribution of a highly viewed video to multiple data centers for better performance in multiple geographies, and no doubt tons of other stuff I'm not thinking of or am ignorant of.

  • In camera raw how large should I open my raw file before converting it to a TIF file?

    In Camera Raw, how large should I open my raw file before converting it to a TIF file: 2736x3648 (10.0 MP), 3072x4096 (12.7 MP), or 3840x5120 (19.7 MP)? I want a sharp TIF file. I'm shooting with an Olympus E-510, a 10.0 MP camera.
    thanks - Ken
    [email protected]

    rasworth wrote:
    There is no advantage to saving or opening other than at the native resolution.
    Actually, that's not entirely true. In the case of Fuji DSLRs, you would do better to double the resolution in Camera Raw, as that matches the interpolation that the Fuji software does for their higher quoted "effective resolutions" (it isn't real resolution, mind you, but it can benefit certain image types).
    If you know for an absolute fact that you need more resolution than the native file has, you really might want to test upsampling in Camera Raw, as it has a newly tuned upsample (introduced in ACR 5.2) that is an adaptive bicubic algorithm, using either normal Bicubic or Bicubic Smoother depending on the size, and that's something even Photoshop can't do.
    But in general, unless you know for a fact you need the image bigger (or have a Fuji camera), processing at the file's native size is the most efficient.
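
    This isn't ACR's adaptive bicubic, but if you want a quick stand-in test of whether doubling the pixel dimensions of an already-converted TIFF buys you anything, here is a plain-bicubic sketch with Pillow (placeholder file names):

    from PIL import Image

    img = Image.open("native_size.tif")                     # placeholder name
    doubled = img.resize((img.width * 2, img.height * 2), Image.BICUBIC)
    doubled.save("doubled.tif", compression="tiff_lzw")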

  • How can I increase the performance of Photoshop CS6 v13.0.6 with transformations on LARGE image files (25,000 x 50,000 pixels) on an iMac 3.4 GHz Intel Core i7, 16 GB memory, Mac OS 10.7.5? Should I purchase a Mac Pro?

    I have hundreds of these large image files to process. I frequently use the SKEW, WARP, and IMAGE ROTATION features in Photoshop. I process the files ONE AT A TIME and have only Photoshop running on my iMac. Most of the time I am watching SLOW progress bars waiting for the transformations to complete.
    I have allocated as much memory as I can (about 14GB) exclusively to Photoshop using the Performance preferences panel. Does anyone have a trick for speeding up the processing of these files, or is my only solution to buy a faster Mac, like the new Mac Pro?


  • I need to sort very large Excel files and perform other operations.  How much faster would this be on a MacPro rather than my MacBook Pro i7, 2.6, 15R?

    I am a scientist and run my own business. Money is tight. I have some very large Excel files (~200MB) that I need to sort and perform logic operations on. I currently use a MacBook Pro (i7 core, 2.6GHz, 16GB 1600 MHz DDR3) and I am thinking about buying a multicore Mac Pro. Some of the operations take half an hour to perform. How much faster should I expect these operations to be on a new Mac Pro? Is there a significant speed advantage in the 6-core vs 4-core? Practically speaking, what are the features I should look at, and what is the speed bump I should expect if I go to 32GB or 64GB? Related to this, I am using a 32-bit version of Excel. Is there a 64-bit spreadsheet that I can use on a Mac that has no limit on column and row size?

    Grant Bennet-Alder,
    It’s funny you mentioned using Activity Monitor.  I use it all the time to watch when a computation cycle is finished so I can avoid a crash.  I keep it up in the corner of my screen while I respond to email or work on a grant.  Typically the %CPU will hang at ~100% (sometimes even saying the application is not responding in red) but will almost always complete the cycle if I let it go for 30 minutes or so.  As long as I leave Excel alone while it is working it will not crash.  I had not thought of using the Activity Monitor as you suggested. Also I did not realize using a 32 bit application limited me to 4GB of memory for each application.  That is clearly a problem for this kind of work.  Is there any work around for this?   It seems like a 64-bit spreadsheet would help.  I would love to use the new 64 bit Numbers but the current version limits the number of rows and columns.  I tried it out on my MacBook Pro but my files don’t fit.
    The hatter,
    This may be the solution for me. I’m OK with assembling the unit you described (I’ve even etched my own boards) but feel very bad about needing to step away from Apple products.  When I started computing this was the sort of thing computers were designed to do.  Is there any native 64-bit spreadsheet that allows unlimited rows/columns, which will run on an Apple?  Excel is only 64-bit on their machines.
    Many thanks to both of you for your quick and on point answers!
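
    One hedged alternative to a 32-bit spreadsheet, since the sort and logic operations themselves are simple: a 64-bit Python/pandas session has no 4 GB ceiling or fixed row limit. The file and column names below are placeholders.

    import pandas as pd

    df = pd.read_csv("measurements.csv")                    # ~200 MB fits comfortably in 16 GB RAM
    df = df.sort_values(["sample_id", "timestamp"])         # the slow Excel sort
    hits = df[(df["signal"] > 0.5) & (df["flag"] == "ok")]  # an example logic/filter operation
    hits.to_csv("sorted_hits.csv", index=False)
    print(len(df), "rows sorted,", len(hits), "rows kept")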

  • Performance Problem in parsing large XML file (15MB)

    Hi,
    I'm trying to parse a large XML file (15 MB) and am facing a clear performance problem. A simple XML validation using the following code snippet:
    DBMS_LOB.fileopen(targetFile, DBMS_LOB.file_readonly);
    DBMS_LOB.loadClobfromFile(
        tempCLOB,
        targetFile,
        DBMS_LOB.getLength(targetFile),
        dest_offset,
        src_offset,
        nls_charset_id(CONSTANT_CHARSET),
        lang_context,
        conv_warning
    );
    DBMS_LOB.fileclose(targetFile);
    p_xml_document := XMLType(tempCLOB, p_schema_url, 0, 0);
    p_xml_document.schemaValidate();
    is taking 30 minutes on an HP-UX machine (4GB RAM, 2 CPUs; Oracle version 9.2.0.4).
    Please explain what could be going wrong.
    Thanks In Advance,
    Vineet

    Thanks Mark,
    I'll open a TAR and also upload the schema and instance XML.
    If I'm not changing the track too much :-) one more thing in continuation:
    If I skip the schema validation step and directly insert the instance document into a schema-linked XMLType table, what does Oracle XDB do in such a case?
    I'm getting a severe performance hit here too... the same file as above takes almost 40 minutes to insert.
    code snippet:
    DBMS_LOB.fileopen(targetFile, DBMS_LOB.file_readonly);
    DBMS_LOB.loadClobfromFile(
        tempCLOB,
        targetFile,
        DBMS_LOB.getLength(targetFile),
        dest_offset,
        src_offset,
        nls_charset_id(CONSTANT_CHARSET),
        lang_context,
        conv_warning
    );
    DBMS_LOB.fileclose(targetFile);
    p_xml_document := XMLType(tempCLOB, p_schema_url, 0, 0);
    -- p_xml_document.schemaValidate();
    insert into INCOMING_XML values(p_xml_document);
    Here, table INCOMING_XML is:
    TABLE of SYS.XMLTYPE(XMLSchema "http://INCOMING_XML.xsd" Element "MatchingResponse") STORAGE Object-relational TYPE "XDBTYPE_MATCHING_RESPONSE"
    This table and the type XDBTYPE_MATCHING_RESPONSE were created using the mapping provided in the registered XML Schema.
    Thanks,
    Vineet

  • Whenever I try to download a rather large file, I continue to get the "could not read source file" error. Tried a new profile, uninstalling, and looking for the compreg.dat file to delete - nothing is working. Please help

    Whenever I try to download a rather large file, I continue to get the "could not read source file" error. I have tried a new profile, uninstalling, and looking for the compreg.dat file to delete - nothing is working. Please help.

    Did you reinstall CS3 after CC?
    For that matter, doing an in-place upgrade on the OS is always a gamble with Adobe programs. Reinstalling all the versions you need, in order, would probably solve your problem.
    And you shouldn't need to save as IDML after opening the .inx in CC.

  • Cannot load large CSV files in SignalExpress ("Not enough memory to complete this operation" error)

    Hi guys,
    I'm new here and have browsed some of the related topics regarding my problem, but could not seem to find anything to help me fix it, so I decided to post this.
    I currently have a waveform saved from an oscilloscope that is quite big (around 700MB, CSV file format) and I want to view it on my PC using SignalExpress. Unfortunately, when I try to load the file using "Load/Save Signals -> Load From ASCII", I always get the "Not enough memory to complete this operation" error. How can we view and analyze large waveform files in SignalExpress? Is there a workaround for this?
    Thanks,
    Louie
    P.S. I'm very new to Signal Express and haven't modified any settings on it.

    Hi Louie,
    Are you encountering a read-only message when you tried to save the boot.ini file? If so, you can try this method: right-click on My Computer >> Select "Properties", and go to the "Advanced" tab. Select "Settings", and on the next screen, there is a button called Edit. If you click on Edit you should be able to modify the "/3GB" tag in boot.ini. Are you able to change it in this manner? After reboot, you can reload the file to see if it helps.
    To open a file in SignalExpress, a contiguous chunk of memory is required. If SignalExpress cannot find a contiguous memory chunk that is large enough to hold this file, then an error will be generated. This can happen when fragmentation occurs in memory, and fragmentation (memory management) is managed by Windows, so unfortunately this is a limitation that we have.
    As an alternative, have you looked at NI DIAdem before? It is a software tool that allows users to manage and analyze large volumes of data, and has some unique memory management method that lets it open and work on large amounts of data. There is an evaluation version which is available for download; you can try it out and see if it is suitable for your application. 
    Best regards,
    Victor
    NI ASEAN
    Attachments:
    Clipboard01.jpg (181 KB)
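
    If DIAdem isn't an option, one workaround sketch is to stream the 700 MB scope CSV in chunks with pandas, so no single contiguous block of memory is ever needed, and then look at a decimated preview. The file names and decimation factor are assumptions.

    import pandas as pd

    decimate = 100          # keep every 100th sample for a quick-look preview
    pieces = []

    for chunk in pd.read_csv("scope_capture.csv", chunksize=1_000_000):
        pieces.append(chunk.iloc[::decimate])

    preview = pd.concat(pieces, ignore_index=True)
    preview.to_csv("scope_capture_preview.csv", index=False)
    print(len(preview), "rows in the decimated preview")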
