Recovering an EUL from an EEX file - Not enough memory

Hi,
I have the following problem.
I need to recover the EUL from an .EEX file.
The import gets to about 65 percent, in step 3/4, then a "Not enough memory" error appears and it aborts.
The .EEX file is 2 GB in size.
Does anyone have an idea why this happens?
Thank you very much

Hello
Your problem is exactly as described: you are out of memory. Presumably you are logging in to Discoverer Admin and running the import from within the tool? Make sure you have no other applications open and that your machine has as much memory installed as possible. If using XP you will need 4 GB of RAM, although XP can only address about 3.5 GB.
If this doesn't work, then try doing the import from either the DOS command line or the Java command line. These have a much smaller footprint, and I have known several instances where Discoverer Admin ran out of memory only for the command line to load it OK.
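A command-line import of the kind described above might look something like this. This is only a sketch: the executable name (`dis51adm.exe`) and the `/connect`, `/import`, and `/log` switches are assumptions based on typical Discoverer Administrator command-line syntax, and they vary by release, so check the Administration Guide for your version:

```shell
REM Run the EEX import from the Windows command line instead of the
REM Discoverer Admin GUI; the command-line engine has a much smaller
REM memory footprint. (Executable name and switches are illustrative.)
dis51adm.exe /connect eul_owner/password@database /import c:\exports\my_eul.eex /log c:\exports\import.log
```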
Hope this helps
Michael

Similar Messages

  • Opening PSD File - Not Enough Memory

    I created a sequence of images from Particle Illusion in a PSD format. When I open one of the images in Photoshop (CS3), it opens - no problem. When I try to open it in Illustrator CS3, I get the message "Not enough memory to place the file".

    On Windows XP 64, opening PSD files with Illustrator CS3 does not seem to work.
    What you can do to resolve this issue is to save the file as PNG in Photoshop. That can be opened in Illustrator, and transparency is preserved.
      www.revis3d.com

  • Photoshop CS2 Cannot read PNG files--"Not Enough Memory"

    Hi! I have a bit of a problem. I've scoured the interwebs for days but can't find a solution.
    My copy of PS CS2 was able to open PNGs in the past, but it just up and decided it doesn't want to anymore. I'm serious - I can't think of any major overhaul or anything I did to cause it, but now it won't read them correctly. It says there's "not enough memory," which doesn't really make sense when I have a MacBook Pro and nothing running in the background. On top of that, the particular files I've been trying to open are barely 100 KB. I've used canvases upwards of ten thousand pixels in dimension with no problem.
    This is with any PNG file. It's not an "incorrect extension" problem, because I made these files and just need to reopen them to edit things.
    Here are some screenshots that show what's been happening; nothing exciting, but just in case someone notices something I don't:
    That's the picture I want to open.
    And that's what pops up.
    :/ I'm a little dumbfounded. Halp?

    Now, this is strange.
    I restarted my computer and now the message has changed to a simpler one, albeit still confusing.
    Sigh.
    I didn't do anything but restart my computer.

  • When opening files "Not enough memory to load TIFF file."

    My Mac just got a new build of OS X 10.6.8.
    Total memory is 4 GB, of which over 3 GB is free.
    Opening any image gives a memory error: "Not enough memory to load TIFF file"

    Open the linked TIFF file(s) in Photoshop and resave with LZW compression off (I believe). I know you can place an LZW TIFF and it usually works fine, but I believe I had this in the past and turned it off / changed the RLE settings, and Illustrator was made happy. Also check whether the image is 8 bits/channel (Photoshop >> Image >> 8 Bits/Channel).

  • Cannot open 705 MB tiff file - Not enough memory

    I have PS CS3 and am running a Dell Precision Workstation 690 with 4 GB RAM, a 1.6 GHz Xeon dual-core processor, and an NVIDIA Quadro FX 3450 video card. I have a TIFF that is 705.4 MB, and when I try to open it an error message pops up that reads: could not complete your request because there is not enough memory (RAM).
    I have gone into preferences/performance and moved the memory usage up to 70%. Do I really need to purchase more RAM for the computer?

    John,
    Try closing all your other apps and see if that helps, or reboot and try it before opening anything else. Use Photoshop->File->Open instead of Bridge to eliminate that as a cause.
    Also, as Michael suggested, you need *lots* of empty disk space for Photoshop to open a file that large. I work with images that size a few times a year and have never had an issue, even when I had an old P4 with 2 GB RAM a couple of years ago. So your system should be up to the task. I've even worked (slowly) on 2 GB images on the old P4, so your 4 GB of RAM is not the culprit. But I've always used an empty second hard disk (my fastest drive) for Photoshop to use as its scratch disk.

  • Cannot Remove Backup files - not enough memory to support this procedure

    I have an SL510 with Windows 7 Professional. When I go to ThinkVantage R&R and click the advanced dropdown box to delete backups, there are no backups shown which can be deleted. I can, however, view backups, but there are no buttons to delete. I'm stuck with 120 GB of backups out of a total capacity of 280 GB.
    Help please.

    I have now eventually found a way to see the backups and to tick the box to delete selected backups. After running for a while I get a message that the system does not have enough memory to support this procedure.

  • Can't open file, not enough memory (RAM)

    This message pops up when I try to open certain files on Photoshop CS and it's ridiculous because I have loads of memory, and it previously opened much bigger files with no problem.  Sometimes it works if I close and open Photoshop and try again, but it isn't this time.  Please help...
    Thank you

    I have discovered that the problem is that the TIFF files I am trying to open are 32-bit files. There are several references on the Internet to Photoshop not being able to open these files. These are output files produced by the astro-imaging software DeepSkyStacker. Within DSS I can save the 32-bit TIFFs as 16-bit files, which Photoshop CS5 can open fine.
    Does anyone know whether Photoshop CS5 can open 32-bit TIFF files at all? Would it help if I could update my version of Photoshop (see update problem above)?

  • HT4910: If I delete a TV show from my iPad because there's not enough memory, can I redownload it later?

    I've downloaded a tv series onto my iPad but not all the episodes downloaded due to the amount of memory on my iPad and what was downloaded was not in order of episodes. Can I delete a few of the episodes and then redownload them at a later date?  Thanks

    You can re-download content purchased from the iTunes store (availability varies depending on location) using the purchased option at the bottom of the screen of the iTunes app on your iOS device.

  • Exporting from Illustrator to Photoshop: Not Enough Memory To Save File

    Exporting from Illustrator to Photoshop: Not Enough Memory To Save File
    I have plenty of memory (6 GB) and the files I try to export are not very big (some less than 5 MB). Yet, I keep getting the above message: Not Enough Memory To Save File
    Any suggestions?
    Many thanks!

    Up-thread, John Joslin mentioned that "memory" is more than RAM. It is RAM, Virtual Memory, Scratch Disk, and is also governed by the allocations for your Adobe programs (in this case).
    The first thing that I would explore is the Windows Virtual Memory (Page File) settings. This ARTICLE might give you some tips.
    Also, see this ARTICLE on setting up a computer for an editing session. It's geared more towards doing video editing, but applies to PS work too.
    Then, go to Edit>Preferences in both PS and AI, and check the allocation of your memory. Too often, people run that up, and then they starve the OS, or any other programs.
    Good luck,
    Hunt

  • Server has not enough memory for operation (some .rpt files not being removed from Temp folder)

    We have a web application developed on the ASP.NET 4.0 framework and published on IIS, and we are using version 13_0_8 of CR.
    I am creating report files and exporting them as PDF, and I am disposing of the streams and report documents at the end. Initially there wasn't any problem, and the temporary files created by Crystal Reports were all deleted. But now requests to the web application have increased to about 50,000 a day, some .rpt files are staying in the Temp folder, and I can't delete them. After recycling the application pool, all files are removed by IIS. Then, after 1 or 2 hours, new .rpt files are created in the Temp folder. And after a while, the application throws "Server has not enough memory for operation". IMHO the reason is the temp files. Here is the code I am using to export the report as PDF.
    Questions:
    1. Is the reason for this exception the temp files in the Temp folder?
    2. What is wrong with this code?
    ReportDocument report = DownloadPDF.GetReport(id);
    MemoryStream stream = (MemoryStream)report.ExportToStream(CrystalDecisions.Shared.ExportFormatType.PortableDocFormat);
    Response.ContentType = "application/pdf";
    Response.AddHeader("content-disposition", "attachment; filename=" + id + ".pdf");
    // Close and dispose the report as soon as the export stream is in hand,
    // so the CR engine can release its temp files.
    report.Close();
    report.Dispose();
    try
    {
        Response.BinaryWrite(stream.ToArray());
        Response.End();
    }
    catch (Exception)
    {
        // Response.End() raises a ThreadAbortException by design.
    }
    finally
    {
        stream.Flush();
        stream.Close();
        stream.Dispose();
    }
    Here is the StackTrace

    Hi Farhad
    At 50,000 requests a day, you are more than likely running into the CR engine limit; i.e., you're pushing way too hard... The following will be good reads for you:
    Crystal Reports 2008 Component Engine Scalability | SCN
    (The above doc does apply to current versions of CR; i.e., no changes.)
    Crystal Reports Maximum Report Processing Jobs ... | SCN
    Scaling Crystal Reports for Visual Studio .NET
    Choosing the Right Business Objects SDK for Your Needs
    Choose the Right SDK for the Right Task
    How Can I Optimize Scalability?
    All of the above apply to your version of CR and thus the next question will be; how to proceed:
    1) Bigger, faster servers will not hurt.
    2) Web farms.
    How Do I Use Crystal Reports in a Web Farm or Web Garden?
    3) Crystal Reports Application Server, or perhaps even SAP BusinessObjects BI Platform 4.1
    Crystal Enterprise Report Application Server - Overview
    - Ludek
    Senior Support Engineer AGS Product Support, Global Support Center Canada
    Follow us on Twitter

  • How can I get all photos from iPhoto to automatically back up to iCloud from my Mac (OS X 10.6.8)? Not enough memory to upgrade.


    You can't. iCloud is not for general file backup from a Mac; it's for backing up and syncing data between mobile devices and Macs. The following is from this Apple document: iCloud: Backup and restore overview.
    iCloud automatically backs up the most important data on your (mobile) device using iOS 5 or later. Once you have enabled Backup on your iPhone, iPad, or iPod touch .....
    What is backed up
    You get unlimited free storage for:
    Purchased music, movies, TV shows, apps, and books
    Notes: Backup of purchased music is not available in all countries. Backups of purchased movies and TV shows are U.S. only. Previous purchases may not be restored if they are no longer in the iTunes Store, App Store, or iBookstore.
    Some previously purchased movies may not be available in iTunes in the Cloud. These movies will indicate that they are not available in iTunes in the Cloud on their product details page in the iTunes Store. Previous purchases may be unavailable if they have been refunded or are no longer available in the iTunes Store, App Store, or iBookstore.
    You get 5 GB of free iCloud storage for:
    Photos and videos in the Camera Roll
    Device settings (for example: Phone Favorites, Wallpaper, and Mail, Contacts, Calendar accounts)
    App data
    Home screen and app organization
    Messages (iMessage, SMS, and MMS)
    Ringtones
    Visual Voicemails
    But not from a Mac.  If you want to backup your photos and other important files I suggest you get an external hard drive and use  it with Time Machine.
    OT

  • CR2008 Not enough memory while exporting reports from Crystal Reports 2008

    I have recently upgraded our Crystal Reports version from Crystal Reports Basic for Visual Studio 2008 to Crystal Reports 2008. After the upgrade I am facing the error "Memory full.OtherErrorFailed to export the report. Not enough memory for operation" when I try to export a report from the Crystal Reports 2008 report viewer, or directly from the code-behind. The application is a hosted application. The problem occurs in our production environment.
    Server details:
    OS: Windows 2003 Enterprise Edition R2 with SP2
    IIS: IIS 6
    .Net Framework: 3.5
    Application details:
    Hosted application using Crystal Reports 2008 SP 3
    Crystal Reports Viewer version: 12.0.2000.0
    The data binding of the report object is done through an ADODB dataset.
    Web.Config:
    <configuration xmlns="http://schemas.microsoft.com/.NetConfiguration/v2.0">
      <configSections>
        <sectionGroup name="businessObjects">
          <sectionGroup name="crystalReports">
            <section name="printControl" type="System.Configuration.NameValueSectionHandler" />
            <section name="crystalReportViewer" type="System.Configuration.NameValueSectionHandler" />
          </sectionGroup>
        </sectionGroup>
      </configSections>
      <businessObjects>
        <crystalReports>
          <printControl>
            <add key="url" value="http://myserver/mysite/PrintControl.cab" />
          </printControl>
          <crystalReportViewer>
            <add key="documentView" value="weblayout" />
          </crystalReportViewer>
        </crystalReports>
      </businessObjects>
      <appSettings>
        <add key="CrystalImageCleaner-AutoStart" value="true" />
        <add key="CrystalImageCleaner-Sleep" value="60000" />
        <add key="CrystalImageCleaner-Age" value="120000" />
      </appSettings>
      <system.web>
        <httpHandlers>
          <add path="CrystalImageHandler.aspx" verb="GET" type="CrystalDecisions.Web.CrystalImageHandler, CrystalDecisions.Web, Version=12.0.2000.0, Culture=neutral, PublicKeyToken=692fbea5521e1304" />
        </httpHandlers>
        <compilation debug="false">
          <assemblies>
            <add assembly="CrystalDecisions.Data.AdoDotNetInterop, Version=12.0.2000.0, Culture=neutral, PublicKeyToken=692FBEA5521E1304" />
            <add assembly="CrystalDecisions.Shared, Version=12.0.2000.0, Culture=neutral, PublicKeyToken=692fbea5521e1304" />
            <add assembly="CrystalDecisions.ReportAppServer.ClientDoc, Version=12.0.1100.0, Culture=neutral, PublicKeyToken=692fbea5521e1304" />
            <add assembly="CrystalDecisions.Web, Version=12.0.2000.0, Culture=neutral, PublicKeyToken=692fbea5521e1304" />
            <add assembly="CrystalDecisions.Enterprise.InfoStore, Version=12.0.1100.0, Culture=neutral, PublicKeyToken=692fbea5521e1304" />
            <add assembly="CrystalDecisions.Enterprise.Framework, Version=12.0.1100.0, Culture=neutral, PublicKeyToken=692fbea5521e1304" />
            <add assembly="CrystalDecisions.ReportSource, Version=12.0.2000.0, Culture=neutral, PublicKeyToken=692fbea5521e1304" />
            <add assembly="CrystalDecisions.CrystalReports.Engine, Version=12.0.2000.0, Culture=neutral, PublicKeyToken=692fbea5521e1304" />
          </assemblies>
        </compilation>
      </system.web>
      <system.webServer>
         <handlers>
             <add name="CrystalImageHandler.aspx_GET" verb="GET" path="CrystalImageHandler.aspx" type="CrystalDecisions.Web.CrystalImageHandler, CrystalDecisions.Web, Version=12.0.2000.0, Culture=neutral, PublicKeyToken=692fbea5521e1304" preCondition="integratedMode" />
         </handlers>
      </system.webServer>
    </configuration>
    Sample Code:
    Report = new CrystalDecisions.CrystalReports.Engine.ReportDocument();
    Report.Load(Server.MapPath(strReportPath));
    Report.SetDataSource(dsReport);
    Creportviewer.ReportSource = Report;
    For exporting the report to PDF
    string Filename = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.InternetCache).ToString(), Guid.NewGuid().ToString() + ".pdf");
    Report.ExportToDisk(ExportFormatType.PortableDocFormat, Filename);
    Clean Up Code: (Page_UnLoad event)
    if (Report != null)
    {
        Report.Close();
        Report.Dispose();
    }
    Creportviewer.ReportSource = null;
    Creportviewer.Dispose();
    dsReport = null;
    GC.Collect();
    GC.WaitForPendingFinalizers();
    Can someone help me resolve this issue?

    The .rpt file size is 14MB with the Data Save option enabled, 12MB without Data Save.  Presumably the 12MB file size is because of the 24bit PNG we have as our background.
    The Designer executes the report in less than a second and we can scroll through all pages and see the image fields perfectly.
    When we Export to PDF, the Designer takes a long time, eventually gets to the 77%, the 7th record and returns "Export report failed" followed by "Memory full".  If we export only page 1 of the 3 pages, it also returns a Memory full error.  However, when the same report is run with only 1 page, that page exports to PDF but with a ridiculously large size and export time.
    The machine has 2GB of physical memory with an 8GB pagefile with Windows 2003 (latest everything).  The process runs up to about 1GB before reporting the memory full error.
    We've also tried a variety of other suggestions posted in the other thread with no success.
    We're happy to provide the RPT file to the Report Team to diagnose the problem.  Ultimately, we need to be able to produce a 15 page report with approximately 45 images.
    Our preferred scenario is fixing problem 2.  The CR Designer seems quite capable of rendering our report and printing it to our third party PDF printer in a timely manner with small size.  However, the API reports memory full.
    The API resides in a dedicated reporting web service with NO other code except for loading the report, setting parameters and printing.  When executing, it uses up to about 1.1GB before reporting the error.
    Are there any other suggestions for fixing what we have?  Are there known problems with large images in reports?  Do we need to lodge a formal support request?
    Regards,  Grant.
    PS.  Grr and my message formatting is lost when I edited this message!!!
    There is a 1500-character limit, and then all formatting is removed to save space. Break your posts into separate entries.
    Edited by: grantph on Sep 30, 2009 2:49 AM

  • Not enough memory to complete operation when writing TDMS file.

    Hello,
       I am new to LabVIEW and having a bit of trouble. I attach some code here. What I want to do is sample from an NI 9220 DAQ on 16 channels at 20 kHz, while a second NI 6009 samples from 4 channels at 1000 kHz. I want to append these together and then write to a TDMS file.
       I have tried to write this code using NI-DAQmx VIs, but when I have, it has resulted in the two DAQs not having the right timing with each other: the 6009 samples for a longer time.
       I have now tried instead to use the DAQ Assistant to read from the two devices, and it works in that the TDMS files produced have the correct timing between the two DAQs. However, if I record for more than 2 minutes (in the end I want to record for a much longer time), I get the "Not enough memory to complete operation" message. This still happens even if I get rid of the charts that display the data, and also if I get rid of the NI 6009 completely and just keep the 9220 sampling at 20 kHz. It happens even if I replace my TDMS write with a Write Measurement File assistant in which I tell it to write a series of files that are each less than 2 minutes long.
        I think it is something to do with the amount of data I am reading being held in memory. What can I do about this? Also, my charts display very slowly, basically every second when the 20k samples are read in. However, if I lower the amount of data read, the charts don't display all the data points.
        I attach my code, thanks for your help!
        Alfredo 
    Attachments:
    03_02_15.vi ‏688 KB

    As far as your charts updating very slowly - the way your code is designed, your charts only get data when both 20K samples & 1M samples are done collecting.  Have you tried setting up DAQ assistant for continuous sampling instead of 20K samples or 1M samples?
    -BTC
    New Controls & Indicators made using vector graphics & animations? Click below for Pebbles UI
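The underlying fix, regardless of tool, is to stream each acquired chunk to disk as it arrives instead of accumulating the whole recording in memory. A minimal sketch of that pattern in plain Python (the chunk size, channel count, and binary format here are illustrative stand-ins, not NI's API):

```python
import struct

CHANNELS = 16
CHUNK_SAMPLES = 2000  # samples per channel per read (0.1 s at 20 kHz)

def acquire_chunk():
    """Stand-in for one DAQ read: CHANNELS * CHUNK_SAMPLES samples."""
    return [0.0] * (CHANNELS * CHUNK_SAMPLES)

def record(path, n_chunks):
    """Append each chunk to the file immediately, so memory use stays
    bounded at one chunk no matter how long the recording runs."""
    with open(path, "ab") as f:
        for _ in range(n_chunks):
            chunk = acquire_chunk()
            f.write(struct.pack("%dd" % len(chunk), *chunk))
```

Ten chunks land on disk as 10 × 16 × 2000 × 8 bytes, while the process never holds more than one 256 KB chunk at a time; recording length no longer drives memory use.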

  • File Error - FCP-6 - "not enough memory" ---DVDPro disk is slightly jerky..

    I have a MacBook: 2 GHz Intel Core Duo, 1 GB 667 MHz DDR2 SDRAM. Final Cut Pro 6.0.1.
    OS 10.4.11
    I have a project file - HD that has worked fine up until now. And wouldn't you know it - it's just
    when I need to be able to burn to quicktime that this occurs.
    There appears to be a continuing "file error: not enough memory" - anytime I open a sequence - now in any project.
    I have done a disk repair with disk utility. I zapped the P-RAM. I've tried dragging the project files to my laptop desktop. Nothing works.
    I have a LaCie external drive with over 150 GB of space left. Another project was shot in HDV,
    and it seems this started to occur after working on that project.
    What else is there? Why is it doing this - of all things, now?
    Please advise - ASAP!?
    Also, one other small detail. If I have my sequence settings in FCP at DVCPRO-HD720p60, and I set my compressor file at 23.98fps - could that be responsible for a subtle jerking in the DVD Studio Pro Disk print? When I pull up the file in DVDpro - and play it on the simulator, the jerking is not present, only in the disk burn.

    The memory that FCP is referring to is RAM. If you're running with less than a couple of gigs, larger projects will report that there's not enough memory... a gig isn't really enough to run FCP anymore. Tiger and Leopard use a LOT of it up so you probably need more RAM.
    The other things that can use up RAM fast are graphics from Photoshop; if they have a blank layer in them, you'll get an out-of-memory error too. So you might check that out, but my gut says you need more RAM for your laptop.
    Jerry

  • iTunes fails to burn large audiobooks: "The iTunes library file cannot be saved. There is not enough memory."

    Hi,
    I am trying to burn a large Audible audiobook to a virtual CD, and iTunes 11.2.2 keeps crashing midway. After spending hours trying to get information off technical forums, here is a description and possible explanation of what is happening:
    The burn starts normally; iTunes creates 10 tracks of 8 min per CD.
    The burn progresses past 10 CDs (of up to 30 CDs).
    Then a message box appears with: "The iTunes library cannot be saved. There is not enough memory."
    In the background the process keeps going for another 2 or 3 CDs, then stops.
    This is NOT a memory problem:
    My computer still has over 100 GB of HDD space left.
    On my i7, only 4 of the 8 cores were used.
    The whole process didn't go over 3.5 GB of RAM (out of 8).
    I reinstalled iTunes.
    I re-imported the files.
    I raised the process priority to High.
    From the forums, this problem started after iTunes 7.x on x64 processors, whenever iTunes used more than about 1.5 GB of RAM.
    It does follow the behavior of a 32-bit system going out of memory.
    The way I see this, Apple has to either create a fix so the current iTunes reuses its memory properly, or release a 64-bit build so it can use whatever RAM is available in the computer.
    Does anyone else have any other idea?
    For those who want to play lawyers: I am not American, and in my country it is legal to remove DRM from content that you legally own. The fact that Audible's software was made by drunk monkeys is enough of a reason.
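The 32-bit ceiling suspected above is easy to put numbers on. Assuming a standard 32-bit Windows process, which by default gets half of the 4 GiB address space (heap fragmentation usually makes large allocations fail well before even that), the arithmetic looks like this:

```python
# Address-space arithmetic for a 32-bit process.
GIB = 1024 ** 3

total_addressable = 2 ** 32          # a 32-bit pointer can address 4 GiB
user_space = total_addressable // 2  # default Windows user/kernel split

print(total_addressable // GIB)      # 4 GiB total address space
print(user_space // GIB)             # 2 GiB usable by the process
# Fragmentation means big allocations typically start failing around
# 1.5 GiB, which matches the behavior reported above.
```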

    Well, I installed the latest iTunes this AM, works perfectly.  Although for some reason it did lock up after playing a song today.
    Horror stories?  That's a bit overly dramatic.
    No, it isn't.
    The issue you are experiencing is one that has occurred in multiple versions of iTunes, a simple search would have revealed this.
    A simple search was conducted, no one seemed to have an answer.  Do you always adopt such a haughty attitude?  Or was I just lucky.

Maybe you are looking for

  • Error in BW upgrade - SEM Add-on

    Hi, I'm upgrading BW from 3.1 to 3.5 on WIN/MSSQL platform. I got stuck on following error: R3up> ERROR: No add-on catalogue "R3ADDON.PAR" found on "c:\TEMP\put\sem3". It's about BW-SEM component 3.5, I put install CD called: SAP:BW:310:ISUPGR:Add_On

  • ADF Developer's Guide For Forms/4GL ERRATA

    In 27.9 Creating a View Object with Multiple Updatable Entities I believe that the following sentence : When you create a view object with multiple entity usages, you can enable a secondary entity usage to be updatable by selecting it in the Selected

  • Deleting takes long time

    Hi, I have a table with around 12 million rows. It takes a very long time, more than 200 seconds even deleting just one row and passing primary id in where clause. This is a partitioned table. But the query runs fine on other tables similarly partiti

  • I cannot open files - they come up with an error opening this file - access denied.  Help!

    All my adobe files are now the same - they will not open at all - access denied with no reason other than an error!  Help!

  • How to unlock an iPhone 5 locked to Three UK

    I bought a second-hand iPhone 5, locked to Three UK. I can't contact the original owner of the iPhone to request Three to unlock it. Is there anything I can do? What if I buy a SIM-only 1-month SIM and register this iPhone onto that SIM; that way will