Unstable with large CP files

I'm evaluating Captivate, using version 1.0.1188.1188. I'm
creating a fairly long, full-screen (1280x1024) training course
using Demonstration mode, with over 300 slides. The CP file is about
100MB in size. I have noticed that Captivate often saves corrupt
files that are much smaller than normal: one had no audio;
Captivate failed to read another back into memory. Once, Captivate
failed to display any menus; another time, after I selected File /
Save As and entered the new file name, Captivate didn't save the
file but didn't report any error. I have 1GB of RAM, a 3GHz CPU, and
30GB of hard disk space. The Windows Task Manager shows around 400MB
of free RAM.
I wonder if Captivate doesn't behave well with large files,
although I didn't think the training session I'm creating was
very long or complex. It is the first one I've ever attempted, so
I'm inexperienced in this.
Do others have similar problems? Should I break my project
into several smaller ones?
Tony

Steve,
I am unfamiliar with the "end of movie" preference you
mentioned. I am having problems with the consistent playback of
nested SWF files in my application. There are only about 25 slides,
but two of the three Flash animations are consistently inconsistent -
that is, sometimes they load, sometimes they don't. I was pretty
sure I had isolated the problem to Internet Explorer 6 (since it worked
fine in other browsers and in the Flash player). If this "end of
movie" preferences option allows me to cut this CP file in half, it
could be helpful in isolating my problem.
Bill

Similar Messages

  • What is the best way of dealing with large TIFF files in OS X Lion?

    I'm working with a large TIFF file (an engineering drawing), but Preview can't handle it (it becomes unresponsive).
    What is the best way of dealing with large TIFF files in OS X Lion? (Viewing only, or simple editing.)
    Thx,
    54n9471

    Use an iPad and this app http://itunes.apple.com/WebObjects/MZStore.woa/wa/viewSoftware?id=400600005&mt=8

  • Speed up Illustrator CC when working with large vector files

    (Mainly) raster files of up to 350MB run fast in Illustrator CC, while vector files of 10MB are a pain in the *blieb* (e.g. zooming and panning). When reading a file, it seems to freeze at around 95% for a few minutes. Memory usage goes up to 6GB; processor usage is 30-50%.
    Are there ways to speed things up while working with large vector files in Illustrator CC?
    System:
    64 bit Windows 7 enterprise
    Memory: 16GB
    Processor: Intel Xeon 3.7GHz (8 threads)
    Graphics: nVidia Geforce K4000

    Files with large numbers of vector points will put a strain on the fastest of computers, but any speed increase we can get you can save you lots of time.
    Delete any unwanted stray points using Select >> Object >> Stray Points.
    Optimize performance | Windows
    Did you draw this yourself? Is the file as clean as it can be? Are there any repeated paths underneath your art, left over from live tracing or stock-art sites, which do not need to be there?
    Check Control Panel >> Programs and Features, sort by recently installed, and uninstall anything suspicious.
    Sorry, there will be no short or single answer to this; as per the previous poster, using layers effectively and working in Outline mode when possible might be the best you can do.

  • Working with large Artboards/Files in Illustrator

    Hello all!
    I'm currently designing a full size film poster for a client. The dimensions of the poster are 27" x 40" (industry standard film poster).
    I am a little uncertain about working with large files in Illustrator, so several problems have come up in several design projects using similar large formats.
    The file size is MASSIVE. This poster uses several large, high-res images that I've embedded. I didn't want them to pixelate, so I made sure they were high quality. After embedding all these images, along with the vector graphics, the entire .ai file is 500MB. How can I reduce this file size? Can I do something with the images to make the .ai file smaller?
    I made my artboard 27" x 40" - the final size of the poster. Is this standard practice? Or, when designing for a large print format, are you supposed to use a smaller, more manageable artboard size and then scale up afterwards, to avoid these massive file sizes?
    I need to upload my support files for the project, including .ai and .eps - so it won't work if they're 500MB. This would be good info to understand for all projects, I think.
    Any help with this would be appreciated. I can't seem to find any coherent information that addresses my particular issues. Thank you very much!
    Asher

    Hi Asher,
    It's probably those high-res images you've embedded. First, be sure your images are only as large as you need them. Second, a solution would be to use linked images while you're working, instead of embedding them into the file.
    Here is a link to a forum with a lot of great discussion about this issue, to get you started: http://www.cartotalk.com/lofiversion/index.php?t126.html
    And another: http://www.graphicdesignforum.com/forum/archive/index.php/t-1907.html
    Here is a great list of tips that someone in the above forum gave:
    - Properly scale files. Do not take a 6x6" file and then use the scaling tool to make it 2x2". Instead, scale it to 2x2 in Photoshop and re-import it. Make a rule: anything over 20%, bring it back into Photoshop for rescaling.
    - Check resolutions. 600dpi is not going to be necessary for such-and-such printer.
    - Delete unused art. Sloppy artists may leave old, unused images under another image. The old one is not being used, but it still takes up space, unnecessarily inflating your file.
    - Choose to link instead of embed. This is your choice. Either way you still have to send a large file, but many times linking is fewer total MB than embedding. Linking also works well with duplicated images: multiple uses link to one original, whereas embedding would make copies.
    - When you are done, use compression software like ZIP or SIT (StuffIt):
    http://www.maczipit.com/
    Compression can reduce file sizes a lot, depending on the files.
    This business deals with a lot of large files. Generally people use FTP to send large files, or plain old CD. Another option is segmented compression: something like WinRAR/MacRAR or DropSegment (a piece of StuffIt Deluxe) compresses files, then breaks them up into smaller, manageable pieces. This way you can break up a 50MB file into, say, 10 x 5MB pieces and send them 5MB at a time (a command-line sketch follows below).
    http://www.rarlab.com/download.htm
    Make sure your client knows how to uncompress those files. You may want to link them to the site to download the software.
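    For the segmented-compression step, a minimal command-line sketch (Info-ZIP's zip plus POSIX split; file names are illustrative):

        zip poster.zip poster.ai
        split -b 5m poster.zip poster.zip.part_
        # the client reassembles the pieces with: cat poster.zip.part_* > poster.zip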
    Good luck!

  • Software available for working with large video files?

    Hello,
    I'm working in PP CS6. I was wondering if there are any workarounds or 3rd party plugins/software that
    make working with really large video files easier and faster?
    Thanks.
    Mark

    Hi Jeff,
    Thanks for helping. This is the first time I've shot video with my Nikon D5200. It was only a 3-minute test clip
    set at the highest resolution, 1920x1080-60i. I saw the red line above the clip in PP CS6 and hit the Enter
    key to render the clip.
    It took almost 18 minutes or so to render the clip. This is probably normal, but I was wondering if there is
    a way to reduce the file size so it doesn't take quite as long to render. I just remember, a few years back
    when the Red camera was out, guys were working with really huge files, and there was a program
    from Cine-something that they used to reduce the file size and make it more manageable when editing.
    I could be mistaken. I've been out of the editing loop for a few years and am just getting back into it.
    Thanks.
    Mark
    Here's my PC's components list you asked for:
    VisionDAW 4U 8-Core Xeon Workstation
      2 Intel QUAD-Core Xeon 5365-3.0GHz, 8MB, 1333MHz Processors
      16GB 667MHz Fully Buffered Server Memory Modules (2x2GB)
      Microsoft® Windows® 7 Ultimate (x64)
      WDC 250GB, Ultra ATA100, 7200 rpm, 8MB Buffer Main OS HD
      2 WDC 750GB, SATA II, 7200 RPM, 16MB Buffer HD (RAID 0)
      2 WDC 750GB, SATA II, 7200 rpm, 16MB Buffer HD (Samples)
      2 WDC 1TB Enterprise Class, SATA II, 7200 RPM, 32MB Buffer Hard Drive
      MOTU 24 I/O (main) / MOTU 2408mk3 (slave)
      Plextor PX-800A 18X Dbl. Layer DVD+/-RW Optical Drive
      Buffalo Blu-ray Drive (External) BR-816SU2
      Front Panel USB Access
      Integrated FireWire (1394a) interface
      Thermaltake Toughpower 850W Power Supply
      3xUAD1 Universal Audio Cards
      NVIDIA QUADRO FX 1800 / Memory 768 MB GDDR3
      CUDA Parallel Processor Cores / 64
      Maximum Display Resolution Digital @60Hz = 2560x1600
      Memory Interface 192Bit
      Memory Bandwidth (GB/sec) / 38.4 GB/sec
      PCI-Express, DUAL-Link DVI 1
      Digital Outputs 3 (2 out of 3 active at a time)
      Dual 25.5" Samsung 2693HM LCD HD Monitors

  • Speeding up a dualcore G5 to work with large Photoshop files

    Hi
    I have a 2.3GHz dual-core G5 (the one that came out in late 2005).
    I'm working on a bunch of large-format film scans (500MB+) in Photoshop CS2, and I'm trying to speed things up.
    This last week I've installed two upgrades that have helped get things moving - first I upgraded the RAM from 4.5GB to 12GB, then I installed a second hard drive (Seagate 7200.11 500GB, with jumper in place) to use as a dedicated scratch disk.
    Both upgrades have given a significant speed boost, but I'm now wondering what else I can do.
    I want to speed up the time that it takes to open and save these large scans.
    My first thought was to buy a second Seagate 500GB drive as a replacement for the original 250GB WD drive. I would then have two 500GB internal drives that I could configure as a RAID 0 array with Disk Utility. I would then clone my original 250GB onto the RAID 0 drives with SuperDuper.
    Wouldn't such a set-up double the speed of opening and saving large Photoshop files?
    I realise that with RAID 0 there is an increased chance of data loss from disk failure (double the chance?), but if I back up daily to a 1TB external, that should be OK, no?
    Or should my next move be to utilise the PCI-E slots (which I don't really understand)?
    Thanks for any advice you can offer.
    Richard

    In my G5 Quad, I find the fastest Photoshop performance overall - especially with large-file opens and saves - occurs when the setup is as follows:
    The startup disk is a 2 x 150GB Raptor RAID 0 which contains the system, apps, and all user files, including the image file being worked on.
    PS scratch is then pointed either to a third fast drive or to an external RAID 0 array. This setup is substantially faster than the safer method of using one Raptor for system and apps, and the other for user and image files, with a third drive for PS scratch.
    With a really fast, large scratch disk, you can sometimes put the image file on the same volume as the scratch file, open and save it from there, and you'll get a bit of an additional boost so long as the scratch disk is faster than the startup disk.
    For CS3 (I believe it works for CS2 as well), a performance plugin called DisableScratchCompress.plugin is available which can also speed things up with a fast scratch disk. I found that using this plugin speeded things up, but only if the image file was opened/saved from the first config I mentioned; it was slower if the file was placed on the scratch disk while using the DSC plugin.
    More here: Photoshop Acceleration Basics
    Of course, if you stripe a disk with data, be sure to back it up frequently. :)
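    (On the failure-odds aside in the question: if each drive independently fails over a given period with probability p, a two-disk stripe loses data when either drive fails, i.e. with probability 1 - (1 - p)^2 = 2p - p^2, which is just under double for small p. So "double the chance" is about right, and the daily backup is the right mitigation.)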

  • Flex file upload issue with large image files

         Hello. I have created a sample Flex application to upload an image, and also a Java servlet to upload and save the image, deployed on a local Tomcat server. I am testing the application over the LAN. I am able to upload small as well as large image files (1MB) from some PCs, but on some other PCs I get an IOError while uploading large image files, although it works fine for small images. The upload hangs at 10%-20% and throws an IOError. Surprisingly, it works OK on XP systems and causes issues on Windows 7 systems.
    Please give me any idea how to get to a solution.
    On the Tomcat server side it gives the following error:
    request: org.apache.catalina.connector.RequestFacade@c19694
    org.apache.commons.fileupload.FileUploadBase$IOFileUploadException: Processing of multipart/form-data request failed. Stream ended unexpectedly
            at org.apache.commons.fileupload.FileUploadBase.parseRequest(FileUploadBase.java:371)
            at org.apache.commons.fileupload.servlet.ServletFileUpload.parseRequest(ServletFileUpload.java:126)
            at flex.servlets.UploadImage.doPost(UploadImage.java:47)
            at javax.servlet.http.HttpServlet.service(HttpServlet.java:637)
            at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
            at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
            at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
            at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
            at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
            at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
            at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
            at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
            at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:293)
            at org.apache.coyote.http11.Http11AprProcessor.process(Http11AprProcessor.java:877)
            at org.apache.coyote.http11.Http11AprProtocol$Http11ConnectionHandler.process(Http11AprProtocol.java:594)
            at org.apache.tomcat.util.net.AprEndpoint$Worker.run(AprEndpoint.java:1675)
            at java.lang.Thread.run(Thread.java:722)
    Caused by: org.apache.commons.fileupload.MultipartStream$MalformedStreamException: Stream ended unexpectedly
            at org.apache.commons.fileupload.MultipartStream$ItemInputStream.makeAvailable(MultipartStream.java:982)
            at org.apache.commons.fileupload.MultipartStream$ItemInputStream.read(MultipartStream.java:886)
            at java.io.InputStream.read(InputStream.java:101)
            at org.apache.commons.fileupload.util.Streams.copy(Streams.java:96)
            at org.apache.commons.fileupload.util.Streams.copy(Streams.java:66)
            at org.apache.commons.fileupload.FileUploadBase.parseRequest(FileUploadBase.java:366)
    UploadImage.java:
    package flex.servlets;

    import java.io.*;
    import java.util.*;
    import java.util.regex.*;

    import javax.servlet.*;
    import javax.servlet.http.*;

    import org.apache.commons.fileupload.*;
    import org.apache.commons.fileupload.disk.DiskFileItemFactory;
    import org.apache.commons.fileupload.servlet.ServletFileUpload;

    public class UploadImage extends HttpServlet {

        /**
         * @see HttpServlet#doGet(HttpServletRequest request, HttpServletResponse
         *      response)
         */
        protected void doGet(HttpServletRequest request,
                HttpServletResponse response) throws ServletException, IOException {
            doPost(request, response);
        }

        public void doPost(HttpServletRequest request,
                HttpServletResponse response) throws ServletException, IOException {
            PrintWriter out = response.getWriter();
            boolean isMultipart = ServletFileUpload.isMultipartContent(request);
            System.out.println("request: " + request);
            if (!isMultipart) {
                System.out.println("File Not Uploaded");
                return;
            }
            FileItemFactory factory = new DiskFileItemFactory();
            ServletFileUpload upload = new ServletFileUpload(factory);
            List items;
            try {
                items = upload.parseRequest(request);
                System.out.println("items: " + items);
            } catch (FileUploadException e) {
                e.printStackTrace();
                return;
            }
            Iterator itr = items.iterator();
            while (itr.hasNext()) {
                FileItem item = (FileItem) itr.next();
                if (item.isFormField()) {
                    // Plain form field: just log its name and value.
                    String name = item.getFieldName();
                    System.out.println("name: " + name);
                    String value = item.getString();
                    System.out.println("value: " + value);
                } else {
                    // Uploaded file.
                    try {
                        String itemName = item.getName();
                        System.out.println("Text before replacing is:-" + itemName);
                        // Random suffix so repeated uploads get unique names.
                        Random generator = new Random();
                        int r = Math.abs(generator.nextInt());
                        // Strip '.' and '*' characters from the upload name.
                        Pattern pattern = Pattern.compile("[.*]");
                        Matcher matcher = pattern.matcher(itemName);
                        StringBuffer buffer = new StringBuffer();
                        while (matcher.find()) {
                            matcher.appendReplacement(buffer, "");
                        }
                        matcher.appendTail(buffer);
                        // File extension, including the dot (e.g. ".png").
                        int indexOf = itemName.indexOf(".");
                        String domainName = itemName.substring(indexOf);
                        System.out.println("domainName: " + domainName);
                        String finalimage = buffer.toString() + "_" + r + domainName;
                        System.out.println("Final Image===" + finalimage);
                        // Note: the file is written under a fixed name, not finalimage.
                        File savedFile = new File(getServletContext().getRealPath("assets/images/") + "/LowesFloorPlan.png");
                        //File savedFile = new File("D:/apache-tomcat-6.0.35/webapps/ROOT/example/" + "\\test.jpeg");
                        item.write(savedFile);
                        out.println("<html>");
                        out.println("<body>");
                        out.println("<table><tr><td>");
                        out.println("</td></tr></table>");
                        out.println("image inserted successfully");
                        out.println("</body>");
                        out.println("</html>");
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            }
        }
    }

    This only occurs on Windows 7 systems, and the root of the problem is the SSL certificate.
    Workaround:
    Open the application in IE and click on the certificate error link in the address bar. Click "Install Certificate" and you are done.
    Happy programming.
    Thanks
    DevSachin
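    As an aside: if request size limits rather than the certificate are ever the culprit with large uploads, commons-fileupload has explicit knobs for this. A minimal sketch using its standard setters (values are illustrative) - in doPost above, the factory/upload construction could become:

        DiskFileItemFactory factory = new DiskFileItemFactory();
        // spill uploads above 1MB to disk instead of holding them in memory
        factory.setSizeThreshold(1024 * 1024);
        factory.setRepository(new File(System.getProperty("java.io.tmpdir")));
        ServletFileUpload upload = new ServletFileUpload(factory);
        // refuse requests larger than 20MB outright
        upload.setSizeMax(20 * 1024 * 1024);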

  • Premiere Pro 2.0 slows when dealing with larger video files

    I'm having issues with Premiere Pro 2.0 slowing to a crawl and taking 60-90 seconds to come back to life when dealing with larger .avi's (8+ mins). When I try to play a clip on the timeline, drag the slider over said clip, or play from a title into said clip on the timeline, Premiere hangs. The clips in question are all rendered, and the peak file has been generated for each different clip as well. This is a new problem; the last time I was working with a larger clip (45+ mins, captured from a Hi-8 cam), I had no problems. Now I experience this slowdown with all longer clips, although I've only dealt with footage captured from a Hi-8 cam and also a mini-DV cam. This problem has made Premiere nearly unusable. I'm desperate at this point.
    System:
    CPU: P4 HT 2.4ghz
    Ram: 2x 1gb DDR
    Video: ATI Radeon 9000 Series
    Scratch Disk: 250gb WD My Book - USB 2.0 (I suspect this might be part of the problem)
    OS: XP Pro SP2
    I'm not on my machine right now, and I can definitely provide more information if needed.
    Thanks in advance.

    Aside from some other issues, I found that USB was just not suited for editing to/from, even on a much faster machine than the one you list.
    FW-400 was only slightly better. It took FW-800 before I could actually use the externals for anything more than storage, i.e. no editing, just archiving.
    eSATA would be even better/faster.
    Please see Harm's ARTICLES on hardware, before you begin investing.
    Good luck,
    Hunt
    [Edit] Oops, I see that Harm DID link to his articles. Missed that. Still, it is worth mentioning again.
    Also, as an aside, PrPro 2.0 has no problem on my workstation when working with several 2-hour DV-AVIs, even when these are edited to/from FW-800 externals.
    Message was edited by: the_wine_snob - [Edit]

  • Does the parser work with large XML files?

    Is there a restriction on the XML file size that can be loaded into the parser?
    I am getting an out-of-memory exception reading in a large XML file (10MB) using the following code:
    DOMParser parser = new DOMParser();
    URL url = createURL(argv[0]);
    parser.setErrorStream(System.err);
    parser.setValidationMode(true);
    parser.showWarnings(true);
    parser.parse(url);
    Win NT 4.0 Server
    Sun JDK 1.2.2
    ===================================
    Error output
    ===================================
    Exception in thread "main" java.lang.OutOfMemoryError
    at oracle.xml.parser.v2.ElementDecl.getAttrDecls(ElementDecl.java, Compiled Code)
    at java.util.Hashtable.<init>(Unknown Source)
    at oracle.xml.parser.v2.DTDDecl.<init>(DTDDecl.java, Compiled Code)
    at oracle.xml.parser.v2.ElementDecl.getAttrDecls(ElementDecl.java, Compiled Code)
    at oracle.xml.parser.v2.ValidatingParser.checkDefaultAttributes(ValidatingParser.java, Compiled Code)
    at oracle.xml.parser.v2.NonValidatingParser.parseAttributes(NonValidatingParser.java, Compiled Code)
    at oracle.xml.parser.v2.NonValidatingParser.parseElement(NonValidatingParser.java, Compiled Code)
    at oracle.xml.parser.v2.ValidatingParser.parseRootElement(ValidatingParser.java:97)
    at oracle.xml.parser.v2.NonValidatingParser.parseDocument(NonValidatingParser.java:199)
    at oracle.xml.parser.v2.XMLParser.parse(XMLParser.java:146)
    at TestLF.main(TestLF.java:40)
    null

    We have a number of test files of that size, and the parser handles them without a problem. However, using the DOMParser does require significantly more memory than your document size.
    What is the memory configuration of the JVM that you are running with? Have you tried increasing it? Are you using our latest version, 2.0.2.6?
    Oracle XML Team
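    For reference on the memory question: with the Sun classic VM (JDK 1.2.2), the heap is raised with the -ms/-mx flags (spelled -Xms/-Xmx on later JVMs). A minimal invocation, with the sizes purely illustrative:

        java -ms32m -mx256m TestLF yourfile.xml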

  • Problems with Large XML files

    I have tried increasing the memory pool using the -mx and -ms options. It doesn't work. I am using your latest XML parser for Java v2. Please let me know if there are specific options I should be using.
    Thanx,
    -Sameer
    We have a number of test files of that size, and the parser handles them without a problem. However, using the DOMParser does require significantly more memory than your document size.
    What is the memory configuration of the JVM that you are running with? Have you tried increasing it? Are you using our latest version, 2.0.2.6?
    Oracle XML Team
    Is there a restriction on the XML file size that can be loaded into the parser?
    I am getting an out-of-memory exception reading in a large XML file (10MB) using the following code:
    DOMParser parser = new DOMParser();
    URL url = createURL(argv[0]);
    parser.setErrorStream(System.err);
    parser.setValidationMode(true);
    parser.showWarnings(true);
    parser.parse(url);
    Win NT 4.0 Server
    Sun JDK 1.2.2
    ===================================
    Error output
    ===================================
    Exception in thread "main" java.lang.OutOfMemoryError
    at oracle.xml.parser.v2.ElementDecl.getAttrDecls(ElementDecl.java, Compiled Code)
    at java.util.Hashtable.<init>(Unknown Source)
    at oracle.xml.parser.v2.DTDDecl.<init>(DTDDecl.java, Compiled Code)
    at oracle.xml.parser.v2.ElementDecl.getAttrDecls(ElementDecl.java, Compiled Code)
    at oracle.xml.parser.v2.ValidatingParser.checkDefaultAttributes(ValidatingParser.java, Compiled Code)
    at oracle.xml.parser.v2.NonValidatingParser.parseAttributes(NonValidatingParser.java, Compiled Code)
    at oracle.xml.parser.v2.NonValidatingParser.parseElement(NonValidatingParser.java, Compiled Code)
    at oracle.xml.parser.v2.ValidatingParser.parseRootElement(ValidatingParser.java:97)
    at oracle.xml.parser.v2.NonValidatingParser.parseDocument(NonValidatingParser.java:199)
    at oracle.xml.parser.v2.XMLParser.parse(XMLParser.java:146)
    at TestLF.main(TestLF.java:40)
    null

    You might try using a different JDK/JRE - either a 1.1.6+ or 1.3 version - as 1.2, in our experience, has the largest footprint. If this doesn't work, can you give us some details about your system configuration? Finally, you might try the SAX interface, as it does not need to load the entire DOM tree into memory; see the sketch below.
    Oracle XML Team
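    For illustration, a minimal SAX sketch against the SAX 1.0 API of that era; it assumes oracle.xml.parser.v2.SAXParser accepts parse(URL) and setDocumentHandler() the same way the DOMParser does above:

        import java.net.URL;
        import org.xml.sax.AttributeList;
        import org.xml.sax.HandlerBase;
        import oracle.xml.parser.v2.SAXParser;

        public class TestSAX {
            public static void main(String[] argv) throws Exception {
                // SAX delivers events as the document streams past, so memory
                // use stays roughly flat instead of growing with the DOM tree.
                SAXParser parser = new SAXParser();
                parser.setDocumentHandler(new HandlerBase() {
                    public void startElement(String name, AttributeList atts) {
                        // handle one element at a time here
                    }
                });
                parser.parse(new URL(argv[0]));
            }
        }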

  • Music lost replaced with large 'other' file

    Hi.
    All my music disappeared from my 3GS, which now has a large amount of 'other' files - I assume these are the corrupt music files.
    I'm having to re-sync my music, but how do I get rid of the large 'other' files? Do I need to restore from a backup?
    Ta

    longestdrive wrote:
    how do I get rid of the large 'other' files? Do I need to restore from a backup?
    That will do it. This article gives all the details and shows you the screens you will see during the restore: http://support.apple.com/kb/HT1414

  • Problem with large text-files, HOWTO?

    Hi!
    I'm making an application which shall search through a directory of 3000 HTML files and find all links in those files.
    I have a text file with the format:
    file1: linktofile:linktofile6:linktofile5
    file2: linktofile1:linktofile87:
    and so on.
    This file shall then be searched when I'm clicking hyperlinks in Internet Explorer. The problem is that this file is VERY long, both horizontally and vertically. Is there a clever way to shorten it?

    If you have to search the entire contents of all 3000 files every time, then I don't see how that could be shortened. But if you only have to search those files for instances of "linktofile1295", for example, then you could redesign your text file into a database where you could access those instances directly via an index - see the sketch below.
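    A minimal Java sketch of that index idea, assuming the one-line-per-file format shown above (the class and method names are hypothetical):

        import java.io.*;
        import java.util.*;

        // Hypothetical in-memory index: link target -> files containing that link.
        // Assumes lines like "file1: linktofile:linktofile6:linktofile5".
        public class LinkIndex {
            private final Map<String, List<String>> index = new HashMap<>();

            public LinkIndex(File linkFile) throws IOException {
                try (BufferedReader in = new BufferedReader(new FileReader(linkFile))) {
                    String line;
                    while ((line = in.readLine()) != null) {
                        int colon = line.indexOf(':');
                        if (colon < 0) continue;
                        String file = line.substring(0, colon).trim();
                        for (String link : line.substring(colon + 1).split(":")) {
                            link = link.trim();
                            if (!link.isEmpty()) {
                                index.computeIfAbsent(link, k -> new ArrayList<>()).add(file);
                            }
                        }
                    }
                }
            }

            // Direct lookup instead of rescanning the whole text file per click.
            public List<String> filesLinkingTo(String link) {
                return index.getOrDefault(link, Collections.emptyList());
            }
        }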

  • I deal with large TextEdit files (>5GB) regularly - autosave is eating my hard drive!

    My hard drive is now composed of >160GB of "other" since I started using multiple large TextEdit files for genotyping work.
    How can I delete these files? Where are they? I have nothing left to delete!


  • Problem on Macbook Pro (16GB RAM, i7) with large video files

    Hello,
    I have a recurring problem working with Premiere Pro CS6 on a new MacBook. The MacBook has 16GB RAM and an i7 processor, and I have the video files on the internal disk. There is enough space on the disk, approx. 250GB free.
    I have a Canon Legria HF28. After loading the videos from the camera, they are .MOV files. With files of about 1GB it seems to work fine. But with a large file, e.g. a 1-hour HD video of 3.5GB, it loads the video, but playback and any kind of editing do not work properly. Sometimes I am able to start the video, but it stops after a few minutes and I can't rewind or move forward. In most cases Adobe Premiere crashes, and restarting, or opening the file in new projects, does not help either. I wonder if somebody has similar problems. My understanding is that I have enough RAM and a fast enough processor that it should actually work fine. The video files themselves are OK, as they run in QuickTime with no problem; moving back and forth in the video is also fine in a simple player, but not in Premiere.
    Any ideas? And thank you for your help in advance.
    Thomas

    My 2011 MBP can take 16GB RAM. macsales.com sells the 16GB kit for $160 currently. Apple tests against the more commonly available RAM types when publishing hardware limits, so it only reports 8GB as a maximum.
    8GB of RAM is about $50 through macsales.com, so upgrade beyond the default 4GB regardless of where you decide to go.

  • Problems with large Photoshop files CC 2014

    Hi,
    Having a strange issue with the file size of some PSDs.
    I have two basically identical images. They are both the same size (same ppi and same measurements in cm). Both of them are single-layer, and there are no paths or other hidden stuff. Both are in sRGB as well. The issue here is that the file on the left is about 12MB and the one on the right is almost twice that, and I can't for the life of me figure out why. As far as I'm concerned, the image on the left looks a bit more "advanced" and should be the bigger one.
    This isn't just this image, but most of the images I've been working on for the last couple of weeks. I chose this one because it was easy to compare to an older image of around the same size.
    I also received some images from Samsung a couple of days ago which were the same size as these ones but were 135MB! After flattening each image to one layer it went down to about 25-27MB, but that still feels like a lot to me.
    Have there been any changes to the way Photoshop handles file size in the latest version (CC 2014)?
    And before anyone asks: yes, they are saved with Maximize Compatibility on.
    Any ideas? Sorry for the bad English, by the way.

    No, there have been no changes to the handling of large files, or files in general.
    But overlooked differences in bit depth, cropping, layers, etc. could explain a file size difference for files that look more or less the same.
