Too large of a file?

My CS5 Illustrator is having a really hard time running. I think part of it is my laptop, since it's only 32-bit, but I've never had so many problems with it before. I made a 20"x30" board layout and imported Revit-rendered images into the file. The Illustrator file ended up being 1.60 GB, which seems a little large. The file also had a really hard time saving: it took 20 minutes to save, and Illustrator shut down twice before that. Is there a way I can reduce my file size without losing my 300 dpi resolution?
Thanks in advance.

When you save with PDF compatibility, all items are effectively doubled in the file, and all images are embedded in the PDF portion as well.
Illustrator is still a 32-bit app.

Similar Messages

  • Mail: File too large

    I tried to send too large a file in Mail and now I can't delete it. It has reappeared as 8 copies in recovered mail files and when I try to delete it, it comes back again. My entire computer is bogged down. What can I do?

    Well, you have to log in to the webmail version of AOL and delete that message there, because that's where the hang-up is happening. Then you have to empty the trash there in AOL on the web.
    After that, take all your Mail accounts offline as soon as Mail shows you the Message Viewer window. Then look for the original copy of the email you tried to send through AOL in Mail and delete it, along with any copies that might also be present (I'm guessing that you've set it up as an IMAP account); be sure to empty the Trash in Mail and check the Trash folder to be sure there's nothing in it.
    Note that if you do have your AOL account set up as an IMAP account, all your mail is stored on the server, so unless you delete things yourself by logging into AOL on the web, all messages that you send are stored there, regardless of whether or not they actually succeed in getting sent. With IMAP accounts, you can't simply delete email from within Mail; it will reappear as soon as you check your mail again.

  • LR 3.3RC : Won't update metadata to (too large ?) psd

    On XP 32 SP3
    (PSD in PS CS5: Maximize PSD File Compatibility is set to "Always"; Generate Composite was OK)
    After modifying a large 16b PSD file in LR 3.3RC,
    I get an exclamation mark on the filmstrip thumbnail
    and "Sidecar File Has Conflict" in the balloon.
    "Retry Metadata Export" is not effective.
    Have I reached some kind of limit?

    I don't know what caused this error, but the direct answer to your question is NO, you have not reached some sort of limit.
    I don't think there is such a thing as a .PSD file that is too large, and "16b" (whatever you mean by that) doesn't sound large to me.
    Are you sure you have write access to the folder containing this photo, and to the .psd file itself?

  • Deploy large WAR file (131M) error

    hi,
    I want to deploy a WAR (131M) application to Oracle 10g Application Server (9.0.4). When I deploy it to the server using the command dcmctl deployapplication -f ../myapp.war -a my aaa -rc /myapp, the error below occurs:
    ADMN-705003
    Evaluation phase failed. This may be caused by an exception from the adapter used in the evaluation.
    Base exception::
    java.lang.OutOfMemoryError:null
    Please contact Oracle Support.
    java.lang.OutOfMemoryError
    at com.evermind.server.rmi.RMIConnection.EXCEPTION_ORIGINATES_FROM_THE_REMOTE_SERVER(RMIConnection.java:1527)
    at com.evermind.server.rmi.RMIConnection.invokeMethod(RMIConnection.java:1480)
    at com.evermind.server.rmi.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:55)
    at com.evermind.server.rmi.RecoverableRemoteInvocationHandler.invoke(RecoverableRemoteInvocationHandler.java:22)
    at __Proxy0.deploy(Unknown Source)
    at oracle.ias.sysmgmt.deployment.j2ee.runtime.LocalDeploy.deployOnSingleInstance(Unknown Source)
    at oracle.ias.sysmgmt.deployment.j2ee.runtime.LocalDeploy.doExecute(Unknown Source)
    at oracle.ias.sysmgmt.deployment.j2ee.runtime.RuntimeIf.execute(Unknown Source)
    at oracle.ias.sysmgmt.deployment.j2ee.adapter.DeploymentAdapter.doEvaluateDeploy(Unknown Source)
    at oracle.ias.sysmgmt.deployment.j2ee.adapter.DeploymentAdapter.evaluate(Unknown Source)
    at oracle.ias.sysmgmt.task.TaskMaster.sync_evaluate(Unknown Source)
    at oracle.ias.sysmgmt.task.TaskMaster.internal_evaluate(Unknown Source)
    at oracle.ias.sysmgmt.task.RemoteEvaluate.execCommand(Unknown Source)
    at oracle.ias.sysmgmt.task.DaemonWorker.run(Unknown Source)
    I have modified dcmctl.bat as follows:
    if %found%==true %ex% -jar -Xms256M -Xmx1024M %jarpath% %cmdString%
    if %found%==false %ex% -jar -Xms256M -Xmx1024M %jarpath% %cmdString% -o %ohome%
    These are the last two lines in the dcmctl.bat file.
    I searched on Metalink and found this document:
    Subject: On Deploying an EAR file You Receive Errors: ADM-705003 and/or Memory Errors
    Doc ID: Note:290662.1 Type: PROBLEM
    Last Revision Date: 26-NOV-2004 Status: MODERATED
    The information in this document applies to:
    Web Services - Version: 9.0.4
    Oracle Containers for J2EE - Version: 9.0.0.0 to 10.10.10
    This problem can occur on any platform.
    j2ee deployments (WAR, EAR files) using dcmctl or dcmservlet
    Symptoms
    On deploying an EAR file, get this error on the server:
    ADMN-705003 and / or this error on server or client:
    java.lang.OutOfMemoryError
    Cause
    Too large EAR file
    Fix
    -- use compression when creating the archive
    NOTE: in JDeveloper, the default is not to compress. See the checkbox under the Options page of the deployment profile's Properties
    (right-click the deployment profile icon, select Properties, then find the Options page)
    - if the error comes from the client, increase the JDeveloper deployment-client heap size:
    (right-click the deployment profile icon, select Properties, then find the Options page)
    - if the error comes from the server, increase the iAS deployment-server heap size:
    In EM Website for the target OC4J Node, select Server Properties and Java Options
    and set, e.g., -Xmx512M for 512M
    References
    Note 223063.1 - Installing 9iAS Fails While Deploying OC4J Applications With No space left on device Error
    But when I put anything in the Java Options, the OC4J home instance can't start up. What can I do?
    Thank you very much,
    lixinzhu

    Hi Li,
    You can have two parallel installations of Application Server, which I would strongly recommend: one instance of 10.1.2.0.2 (upgraded to 10.1.2.2) for Forms and Reports (and maybe Portal, Discoverer, etc. too), and one instance of 10.1.3.2 for your J2EE code.
    If you have multiple servers, you can cluster those OC4Js, which is great. If you have two servers, you can also have one for 10.1.2 and one for 10.1.3, which improves performance overall. However, there's no problem having both 10.1.2 and 10.1.3 on the same server.
    The 10.1.3 server can use the Web Cache (as well as Apache) from the 10.1.2 server.
    Regards,
    Martin
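    A side note on the Metalink fix quoted above ("use compression in creating the archive"): a minimal sketch of compressed vs. uncompressed packaging with the JDK's jar tool, assuming a hypothetical staging directory webapp/ and archive name myapp.war (not the poster's actual build steps):
         cd webapp
         jar cvf ../myapp.war .      # compressed entries (jar's default) - smaller archive for dcmctl to push
         jar cvf0 ../myapp.war .     # the 0 flag stores entries uncompressed - roughly what JDeveloper's "do not compress" default produces
    Deploying the compressed archive keeps the upload that dcmctl hands to the server smaller, which is exactly what the note suggests to avoid the OutOfMemoryError.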

  • Large Adobe files created

    I use macros to create PostScript files using a Tektronix driver. Excel 2003 created small files, but since I upgraded to 2007 the files are a lot larger (from 7 MB to 77 MB). Has anyone found a fix for this? Even when I convert an Excel graph using Adobe, the file is a lot larger.

    When I originally set up this macro, the Adobe PDF driver would not create the file properly. I have since tried the Adobe driver again (when this issue came up), but it still produces too large a file, even when I changed the settings to compress it.

  • ERROR : OpenDoc CR to PDF - File is too large for attachment.

    We are getting the following error in 3.1 using an OpenDoc call when we call a large Crystal Report to PDF format...
    Error : 52cf6f8f4bbb6d3.pdf File is too large for attachment.
    It runs OK from BOE when given parameters that return 44 pages (PDF = 139 KB).
    We get the error with a parameter set that returns 174 pages when run via CR Desktop or as a SCHEDULED instance (PDF = 446 KB).
    The client application can't use the SDKs to SCHEDULE instances - it is only configured for OpenDoc calls.
    The BOE server is running on Solaris, and it is a 2-server CMS cluster.
    The problem is SPORADIC, so I am thinking the issue is related to a specific setting on one of the servers.
    Any thoughts on where to start looking...?

    The problem is _not_ with the number of rows returned - it is an issue with the size of the PDF file that it is trying to move.
    Found a possible Windows solution on BOB - need to find out whether there is an equivalent for Solaris...
    Check dsws.properties on the web server: D:\Program Files\Business Objects\Tomcat55\webapps\dswsbobje\WEB-INF\classes
    See if you can change any parameter to remove the size limitation.
    #Security measure to limit total upload file size
    maximumUploadFileSize = 10485760
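    A minimal sketch of what raising that limit might look like, assuming the same dsws.properties key; the 52428800 value is just an illustrative 50 MB (50 * 1024 * 1024), not a recommended setting, and whether the Solaris deployment honors a larger value is something to verify:
         #Security measure to limit total upload file size
         maximumUploadFileSize = 52428800
    Restart the dswsbobje web application (or Tomcat) after editing so the new value is picked up.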

  • PDF file size too large

    Hi,
    I have a report (6i and 9iDS) which contains an image stored as a BLOB in the database (8i). The size of the image in the database (and as a file) is just 750 KB. The image is sized to fit on the A4 report page. If I set the desformat of this report to PDF, the resulting PDF output file is 10 MB in size. I need to make this report available over the web, so this is too large. Has anyone got any ideas for reducing the output file size?
    I have tried the pdfcomp report parameter with no joy.
    Cheers
    Andy

    Hi Andy,
    The image you are using might be a JPEG image. In 6i and 9i, while generating the PDF file, Oracle Reports always converts the image to GIF and embeds it. This image type conversion increases the size of the output image, and hence the PDF file size increases. This is fixed in Oracle Reports 10g.
    In Oracle Reports 10g, you can select the outputimageformat based on your need, using either:
    1. commandline: OUTPUTIMAGEFORMAT
    (or)
    2. environment variable: REPORTS_OUTPUTIMAGEFORMAT
    If the image in your database is a JPEG image, set the outputimageformat to JPEG. Then there will be no image type conversion, and the PDF file will be much smaller (a sketch of the command line follows at the end of this reply).
    Please refer to the Publishing Reports manual to learn more about the usage of this command line option / environment variable.
    Links:
    http://download-west.oracle.com/docs/cd/B10464_01/bi.904/b10314/pbr_cla.htm#644163
    http://download-west.oracle.com/docs/cd/B10464_01/bi.904/b10314/pbr_rfap.htm#644448
    Thanks,
    Regards,
    Siva B
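    A minimal sketch of that command line, assuming an Oracle Reports 10g rwrun invocation with a hypothetical report name and connect string (placeholders, not a verified recipe):
         rwrun report=myreport.rdf userid=scott/tiger@orcl destype=file desname=myreport.pdf desformat=PDF outputimageformat=JPEG
    or, via the environment variable before starting the Reports engine:
         REPORTS_OUTPUTIMAGEFORMAT=JPEG; export REPORTS_OUTPUTIMAGEFORMAT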

  • ReportExportControl  -- File is too large for attachment

    Hi all,
    Back with a exception again.
    A few reports deployed on BOXIR2 SP4 come up with the following error:
    com.crystaldecisions.report.web.viewer.ReportExportControl
    15bf5ea9377e1c1.rtf File is too large for attachment.
    A few reports take a date range as input and generate results based on that. When the date range is small, the report is generated without any problem. If it is large and returns a large set of records (approx. above 10K records), it comes up with the above error.
    Another set of reports throws this error regardless of the parameters.
    But when the parameters are set in the CMS console and the report is run from the console, it is generated without any problem, whatever the date range.
    After searching a lot, I have no solutions in hand!!
    Please suggest solutions, possible scenarios, or a checklist to solve this issue.
    Thanks & Regards
    lnarayanan

    It turns out the report was so large it was basically overloading everything (disk space, cache sizes, timeouts, etc.). Here's the solution from the case:
    1) For page server and cache server
    Make sure location of Temp Files/Cache files has enough free space (more than 500 MB)
    2) For page server and cache server
    Minutes Before an Idle Connection is Closed = 90
    3) For page server
    Minutes Before an Idle Report Job is Closed = 90
    4) For cache server
    Maximum Simultaneous Processing Threads = automatic
    5) In the command line of page server and cache server, add
    "-requestTimeout 5400000" [without " "]
    6) Maximum Cache Size Allowed (KBytes) = set it to an appropriate value (for e.g. for 500 Mb, value should be 500*1024 = 512000)
    (This setting limits the amount of hard disk space used to store cached pages. If we need to handle large reports, a large cache size is needed. Maximum allowed is 2048 GB)
    After all this, an "Error 500 - Java heap space" happened. The amount of memory allocated to a JVM application can be set using the options -Xms (the initial size) and -Xmx (the maximum size); a sketch of where those options typically go follows.
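    A minimal sketch, assuming the web tier is the bundled Tomcat and using illustrative heap sizes (the exact file and values depend on your install, so treat this as an example rather than a prescription):
         # e.g. appended to Tomcat's startup options (catalina.sh / the Tomcat service Java options)
         JAVA_OPTS="$JAVA_OPTS -Xms256m -Xmx1024m"
         export JAVA_OPTS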
    Regards,
    Bryan

  • How can I insert a number of photos in a numbers doc without the file becoming too large?

    How can I insert a number of photos into a numbers doc without the file becoming too large?

    Use smaller photos.
    Seriously, reduce the file size of the photos in Preview, Photoshop, iPhoto or another application before inserting them into the Numbers document (or do it from the command line; see the sketch below).
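    If you'd rather script it, a minimal sketch using the sips tool built into OS X (the 1024-pixel cap and the filenames are just illustrative; adjust to taste):
         sips -Z 1024 photo.jpg --out photo-small.jpg
    The -Z option resamples the image so its longest side is at most 1024 pixels, which usually shrinks the file considerably before you drag it into Numbers.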
    Regards,
    Barry

  • My mac will not copy more than one file at a time and gets locked up if the file is too large

    my mac will not copy more than one file at a time and gets locked up if the file is too large, my mac will not copy more than one file at a time and gets locked up if the file is too large

    So now that you have repeated the same thing three times that doesn't make things any clearer at all.
    You are copying files from where to where?
    How are you attempting to copy files, software or click and drag?
    Any other detail would be helpful.
    Allan

  • Disk Utility: Creating a new blank image receiving "file too large" error.

    Hello All!
    I'm trying to create a 10 GB non-encrypted, non-compressed read/write blank image via Disk Utility. DU runs for a few minutes, then barfs out a "file too large" error. I have over 30 GB free on my HDD. I tried a smaller size of 6 GB to no avail. I also tried, unsuccessfully, to create an image from a file (about 4 GB). My ultimate goal is to create a case-insensitive image to run an extremely important program needed for high-priority work productivity (i.e. WoW). Thanks in advance for any advice! You will be my new best friend if you help me resolve this. =D
    Hollie
    "There are only 10 types of people in this world: Those who understand binary, and those who don't."

    Hi Hollie, and welcome to the forums!
    Have you created images before successfully?
    Is this to/on your boot drive, or an external drive?
    Have you done any Disk/OS maintenance lately?
    We might see if there are some big temp files left or such...
    How much free space is on the HD, where has all the space gone?
    OmniDiskSweeper is now free, and likely the best/easiest...
    http://www.omnigroup.com/applications/omnidisksweeper/
    WhatSize...
    http://www.macupdate.com/info.php/id/13006/
    Disk Inventory X...
    http://www.derlien.com/
    GrandPerspective...
    http://grandperspectiv.sourceforge.net/
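    If Disk Utility keeps failing, a minimal sketch of creating the same kind of image from Terminal with hdiutil (size, volume name and path are placeholders; plain HFS+ gives you the case-insensitive volume you're after):
         hdiutil create -size 10g -fs HFS+ -volname WoW ~/Desktop/wow.dmg
    If that errors out too, the message it prints may help narrow down whether the limit comes from the destination volume's format (for example, a FAT32-formatted external can't hold a file over 4 GB) or from something else.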

  • Itunes library too large and won't recognize all files, just new ones but

    still shows the old ones and their location in the library. I was keeping the library on the 2TB external drive. It got too large. Now I want to put the music on one of the 1TB drives and just the videos on the 2TB drive. From my PC, I can watch everything.
    I cannot download anything from the 2TB drive to an iPod or iTouch. When I hit max capacity, I temporarily started saving to one of my internal 2TB drives (space wh*re, I know) before I hit on what my solution would be, and now I can only download from that location. Why?
    What can I do? I even tried switching the library location back, and it won't recognize anything. I'm screwed, and I have all this material... YEARS of collecting...
    Someone please help me...

    To clarify, currently on the 2TB external drive are music and videos. I want to split them up and hopefully save some space - and maybe head off problems in the future when I hit max capacity on the 1TB external, and think about a 4TB (not the best option I see at this time) or another two 2TB drives.
    Doesn't the Library file just save locations when iTunes closes down, so everything should be OK? I mean, the program can recognize everything; it just won't download to anything. Even if I move an older movie from the external drive to the new drive, it refuses to "see" the movie so I can download it.

  • File too large error unpacking WAR during app deploy - RHEL & WLS 10.3.5

    I'm stumped and I'm hoping someone can help out here. Does anyone have any insights into the cause of my problem below, or tips on how to diagnose the cause?
    scenario
    We ran into an open files limit issue on our RH Linux servers and had the SA boost our open files limit from 1024 to 3096. This seems to have solved the open files limit issue once we restarted the node managers and the managed servers (our WLS startup script sets the soft limit to the hard limit).
    But now we've got a new issue, right after this change. The admin server is no longer able to deploy any war/ear: when I click on "Activate" after the install I get
    Message icon - Error An error occurred during activation of changes, please see the log for details.
    Message icon - Error Failed to load webapp: 'TemplateManagerAdmin-1.0-SNAPSHOT.war'
    Message icon - Error File too large
    on the console, and see the stack trace below in the Admin server log (nothing in the managed server logs), indicating it's getting the error while exploding the war.
    I've tried both the default deployment mode and the "will make the deployment available in the following location" mode, where the war is manually copied to the same location on each box, available to each server - all with the same result. I've also tried restarting the admin server, but no luck.
    The files are not overly large (<= 34 MB) and we had no trouble with them before today. I'm able to log in as the WebLogic user and copy files, etc. with no problem.
    There is no disk space issue - plenty of space left on all of our filesystems. There is, as far as I can tell, no OS or user file size limit issue:
         -bash-3.2$ ulimit -a
         core file size (blocks, -c) 0
         data seg size (kbytes, -d) unlimited
         scheduling priority (-e) 0
         file size (blocks, -f) unlimited
         pending signals (-i) 73728
         max locked memory (kbytes, -l) 32
         max memory size (kbytes, -m) unlimited
         open files (-n) 3096
         pipe size (512 bytes, -p) 8
         POSIX message queues (bytes, -q) 819200
         real-time priority (-r) 0
         stack size (kbytes, -s) 10240
         cpu time (seconds, -t) unlimited
         max user processes (-u) unlimited
         virtual memory (kbytes, -v) unlimited
         file locks (-x) unlimited
    environment
    WLS 10.3.5 64-bit
    Linux 64-bit RHEL 5.6
    Sun Hotspot 1.6.0_29 (64-bit)
    stack trace
    ####<Mar 6, 2013 4:09:33 PM EST> <Error> <Console> <nj09mhm5111> <prp_appsvcs_admin> <[ACTIVE] ExecuteThread: '3' for queue: 'weblogic.kernel.Default (self-tuning)'> <steven_elkind> <> <> <1362604173724> <BEA-240003> <Console encountered the following error weblogic.application.ModuleException: Failed to load webapp: 'TemplateManagerAdmin-1.0-SNAPSHOT.war'
    at weblogic.servlet.internal.WebAppModule.prepare(WebAppModule.java:393)
    at weblogic.application.internal.flow.ScopedModuleDriver.prepare(ScopedModuleDriver.java:176)
    at weblogic.application.internal.flow.ModuleListenerInvoker.prepare(ModuleListenerInvoker.java:199)
    at weblogic.application.internal.flow.DeploymentCallbackFlow$1.next(DeploymentCallbackFlow.java:517)
    at weblogic.application.utils.StateMachineDriver.nextState(StateMachineDriver.java:52)
    at weblogic.application.internal.flow.DeploymentCallbackFlow.prepare(DeploymentCallbackFlow.java:159)
    at weblogic.application.internal.flow.DeploymentCallbackFlow.prepare(DeploymentCallbackFlow.java:45)
    at weblogic.application.internal.BaseDeployment$1.next(BaseDeployment.java:613)
    at weblogic.application.utils.StateMachineDriver.nextState(StateMachineDriver.java:52)
    at weblogic.application.internal.BaseDeployment.prepare(BaseDeployment.java:184)
    at weblogic.application.internal.SingleModuleDeployment.prepare(SingleModuleDeployment.java:43)
    at weblogic.application.internal.DeploymentStateChecker.prepare(DeploymentStateChecker.java:154)
    at weblogic.deploy.internal.targetserver.AppContainerInvoker.prepare(AppContainerInvoker.java:60)
    at weblogic.deploy.internal.targetserver.operations.ActivateOperation.createAndPrepareContainer(ActivateOperation.java:207)
    at weblogic.deploy.internal.targetserver.operations.ActivateOperation.doPrepare(ActivateOperation.java:98)
    at weblogic.deploy.internal.targetserver.operations.AbstractOperation.prepare(AbstractOperation.java:217)
    at weblogic.deploy.internal.targetserver.DeploymentManager.handleDeploymentPrepare(DeploymentManager.java:747)
    at weblogic.deploy.internal.targetserver.DeploymentManager.prepareDeploymentList(DeploymentManager.java:1216)
    at weblogic.deploy.internal.targetserver.DeploymentManager.handlePrepare(DeploymentManager.java:250)
    at weblogic.deploy.internal.targetserver.DeploymentServiceDispatcher.prepare(DeploymentServiceDispatcher.java:159)
    at weblogic.deploy.service.internal.targetserver.DeploymentReceiverCallbackDeliverer.doPrepareCallback(DeploymentReceiverCallbackDeliverer.java:171)
    at weblogic.deploy.service.internal.targetserver.DeploymentReceiverCallbackDeliverer.access$000(DeploymentReceiverCallbackDeliverer.java:13)
    at weblogic.deploy.service.internal.targetserver.DeploymentReceiverCallbackDeliverer$1.run(DeploymentReceiverCallbackDeliverer.java:46)
    at weblogic.work.SelfTuningWorkManagerImpl$WorkAdapterImpl.run(SelfTuningWorkManagerImpl.java:528)
    at weblogic.work.ExecuteThread.execute(ExecuteThread.java:209)
    at weblogic.work.ExecuteThread.run(ExecuteThread.java:178)
    Caused by: java.io.IOException: File too large
    at java.io.FileOutputStream.writeBytes(Native Method)
    at java.io.FileOutputStream.write(FileOutputStream.java:282)
    at weblogic.utils.io.StreamUtils.writeTo(StreamUtils.java:19)
    at weblogic.utils.FileUtils.writeToFile(FileUtils.java:117)
    at weblogic.utils.jars.JarFileUtils.extract(JarFileUtils.java:285)
    at weblogic.servlet.internal.ArchivedWar.expandWarFileIntoDirectory(ArchivedWar.java:139)
    at weblogic.servlet.internal.ArchivedWar.extractWarFile(ArchivedWar.java:108)
    at weblogic.servlet.internal.ArchivedWar.<init>(ArchivedWar.java:57)
    at weblogic.servlet.internal.War.makeExplodedJar(War.java:1093)
    at weblogic.servlet.internal.War.<init>(War.java:186)
    at weblogic.servlet.internal.WebAppServletContext.processDocroot(WebAppServletContext.java:2789)
    at weblogic.servlet.internal.WebAppServletContext.setDocroot(WebAppServletContext.java:2666)
    at weblogic.servlet.internal.WebAppServletContext.<init>(WebAppServletContext.java:413)
    at weblogic.servlet.internal.WebAppServletContext.<init>(WebAppServletContext.java:493)
    at weblogic.servlet.internal.HttpServer.loadWebApp(HttpServer.java:418)
    at weblogic.servlet.internal.WebAppModule.registerWebApp(WebAppModule.java:972)
    at weblogic.servlet.internal.WebAppModule.prepare(WebAppModule.java:382)

    In the end, the problem was not on the Admin server, where the log entry is, but on one of the managed servers, where there was no such log entry.
    Somehow, and we have no idea how, the NodeManager process had its soft limit for max file size set to 2k blocks, so the managed server inherited that. We restarted the Node Manager, then the managed server, and the problem went away.
    The diagnostic that turned the trick:
    cat /proc/<pid>/limits
    for the managed server showed the bad limit setting, and diagnosis proceeded from there. The Admin server, of course, showed "unlimited", since it was not the source of the problem. A sketch of the check and fix follows.
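    A minimal sketch of that check and the corresponding fix, with <pid> and the script path as placeholders (the exact Node Manager start script varies by install; ulimit -f is the max file size limit the stack trace was hitting):
         # inspect the limits the running managed server / Node Manager inherited
         cat /proc/<pid>/limits | grep "Max file size"
         # in the shell or start script that launches the Node Manager, before startup:
         ulimit -f unlimited
         ./startNodeManager.sh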

  • My time Machine keeps saying, "Time Machine could not complete the backup. This backup is too large for the backup disk. The backup requires 345.74 GB but only 289.80 are available." I have already excluded files. I have a 1tb external drive. HELP!!!

    For over two weeks now I have been frustrated that my Time Machine will not back up to my 1TB external drive. I don't understand why it's a problem now. It keeps saying:
    "This backup is too large for the backup disk. The backup requires 345.74GB but only 289.80GB are available. Time Machine needs work space on the backup disk, in addition to the space required to store backups. Open Time Machine preferences to select a larger backup disk or make the backup smaller by excluding files." So I have already excluded almost all of my files, and even deleted the backup disk, yet that message still keeps popping up. I am truly at a wall with this. I have Mac OS X version 10.7.5. CAN SOMEONE HELP ME PLEASE????

    If you have more than one user account, these instructions must be carried out as an administrator.
    Launch the Console application in any of the following ways:
    ☞ Enter the first few letters of its name into a Spotlight search. Select it in the results (it should be at the top.)
    ☞ In the Finder, select Go ▹ Utilities from the menu bar, or press the key combination shift-command-U. The application is in the folder that opens.
    ☞ Open LaunchPad. Click Utilities, then Console in the icon grid.
    Make sure the title of the Console window is All Messages. If it isn't, select All Messages from the SYSTEM LOG QUERIES menu on the left. If you don't see that menu, select
    View ▹ Show Log List
    from the menu bar.
    Enter the word "Starting" (without the quotes) in the String Matching text field. You should now see log messages with the words "Starting * backup," where * represents any of the words "automatic," "manual," or "standard." Note the timestamp of the last such message. Clear the text field and scroll back in the log to that time. Select the messages timestamped from then until the end of the backup, or the end of the log if that's not clear. Copy them (command-C) to the Clipboard. Paste (command-V) into a reply to this message.
    If there are runs of repeated messages, post only one example of each. Don't post many repetitions of the same message.
    When posting a log extract, be selective. Don't post more than is requested.
    Please do not indiscriminately dump thousands of lines from the log into this discussion.
    Some personal information, such as the names of your files, may be included — anonymize before posting.

  • Why can't I import anything? "File video dimensions (width/height) too large"

    New to PP CS4 here (that will be obvious momentarily...).  I can't seem to import any footage into any project (well, at least none of the bits I've tried, and I don't think there's anything 'wrong' with them).  Every time I attempt to do so, I get the error:
    Dialog title: File Import Failure
    Error Message: File video dimensions (width/height) too large.
    This occurs with every file I've thrown at it - admittedly, they all have a lot in common, but they're not particularly exotic.
    The files are (mostly) AVI files (I know, I know... before I get jumped on, I understand it's just a wrapper, etc.). The files come from two cameras:
    1. A Sony DCR-TRV20. Opening up the AVI in QuickTime Player, I see it's listed as "DV/DVCPRO - NTSC 640X480"
    2. Canon PowerShot A620. Opening up the AVI in QuickTime Player, I see it's listed as "MotionJPEG OpenDML 320x240"
    3. I somehow managed to get AME to spit out an FLV file. That failed to import with the same error message, too.
    I tried running these files through AME, and got the same results (although never having used AME previously, I have a question on that, below).
    New project, settings are all (I think) the defaults: Preset DV-NTSC/Standard 32kHz
    New installation of PP CS4, with the 4.1 update, on a PC running Vista 32 with all the endless MS updates applied.
    Related question: I've skimmed other threads here, and noted the advice to "always convert any non-standard footage to DV-AVI Type II" before importing....however, I see no such option for that output format in AME....only generic things like "Microsoft AVI" and "Uncompressed Microsoft AVI".
    Codec issues?  I'm a little skeptical, since these AVI files play just fine in Windows if I double-click them. Does that not suggest that there's a codec there for them?  The files are from (what I believe to be) garden-variety consumer cameras (these devices are a bit old and certainly not high-end, but they're hardly exotic formats, I think). The files are small, short clips I'm attempting to use just to learn Premiere, but it gags on every one of them, every time.
    PP is so insistent that every file I throw at it has improper dimensions, but these files appear to be perfectly reasonably sized, and I assume it's operator error (I'm withholding judgement on the quality and accuracy of the error string it's presenting me).  But I'm not sure where to look for the right knob to twist.  Pretty frustrating to be stuck on Square 1 (actually, Square Zero).
    Can someone point me in the right direction?
    Please be gentle, it's my first time....

    Thanks, but....mmmm....maybe not.
    I've tried multiple settings for new projects. I just created a new one using the preset for DV-NTSC/Widescreen 48kHz (frame size 720 x 480, 48 kHz audio). Exactly the same results.
    You say "To point you in a certain direction, DV from a TRV is 720x480, not 640x480."  However, according to QuickTime's Movie Inspector panel, the file is 640x480.  This clip was captured through Premiere - that wouldn't change its aspect ratio, would it?
    The files from the Canon still camera are AVI files, not still images (so I don't see how the reference to Photoshop applies, or perhaps I misunderstood your point). They are 320x240, according to QuickTime's Movie Inspector panel. If they're 320x240, wouldn't that be the same aspect ratio as 640x480?  Attempting to import them, I get the same error....
    While not exactly Hollywood quality, I just need files to play with as I attempt to learn the program, and it's not clear to me why these files should not work - other than the fact that NO files seem to work. Also, I'm not sure what your reference to "exporting" pertains to - I can't get anything imported, and I'm not trying to export anything (that I'm aware of) - I'll worry about exporting once I've managed to get something, ANYTHING, to import.
    I've got some PP training videos (from "you-probably-know-who.com"). Following along using their project files and their video assets (MOV files in this case), I still get the exact same error.
    So to summarize, so far, I've seen no evidence that this copy of PP can import ANY video file.
    While I'm sure I would benefit from spending more time with the fine manual, I think there's something wrong here that's not going to be addressed in the introductory documentation.  If I can't import any footage into any project, progress is going to be slow.
    How about this: is there some known-to-be-good test file I can grab and try importing that?  I've got very strong suspicions that I'll get the same error message.
    Thanks again.
