BCC FlexUI big file import timeout.

Hi mates.
We are facing an issue with BCC when importing a big file with cross-shells. During the import, the first screen (Step 1 of 2) is shown, but because the file takes a long time to import (2-3 minutes for a large number of assets), the second screen is never shown even though the process ends successfully. We are using ATG 10.0.3.
Have you faced a similar problem? We have wrapped projectAssets.jsp with transaction marks, but it didn't work. Is there any FlexUI timeout configuration we could set so that the second screen is shown even when the file takes a long time to import?
Thank you very much.
Regards.

Hi Joel.
Thank you very much for your prompt response. We have monitored the process from the moment the BCC user clicks the Import button to make sure we don't have the problem you describe. The import finishes successfully in the back end after a couple of minutes, and the transaction timeout is set up properly in WebLogic to allow long-running transactions in the customer's environment, so we are not hitting connection timeouts. The problem is that the callback to the FlexUI never arrives, probably because the interface listens for a callback only for a period shorter than the time the file needs to import; as a result, the interface doesn't update its state and hangs at Step 1 even though the import has completed and it should reach Step 2. We opened an SR about this case (3-6766177811) and they recommended a change to projectAssets.jsp, wrapping it in a transaction. We've tested the change and it doesn't work; we are still facing the same issue when importing big files.
I am completely new to Flex, so I don't know much about how this behaves. Is there any parameter, such as a listener timer, that could be set in the FlexUI to receive callbacks from long-running transactions?
Thank you very much for your support.
Kind Regards.
Felix Rodriguez.

Similar Messages

  • Not enough space on my new SSD drive to import my data from time machine backup, how can I import my latest backup minus some big files?

    I just got a new 256GB SSD drive for my Mac. I want to import my data from a Time Machine backup, but it's larger than 256GB since it used to be on my old optical drive. How can I import my latest backup while leaving out some big files on the external drive?

    Hello Salemr,
    When you restore from a Time Machine backup, you can tell it not to transfer folders like Desktop, Documents, Downloads, Movies, Music, Pictures and Public. Take a look at the article below for the steps to restore from your backup.
    Move your data to a new Mac
    http://support.apple.com/en-us/ht5872
    Regards,
    -Norm G. 

  • Is an 8.7GB file too big to import into iMovie 9?

    Is an 8.7GB file too big to import into iMovie 9?  I am using iMovie 8.0.6 on a Macbook Pro running OS 10.6.8 on 2.26 GHz Intel Core 2 Duo with 2 GB Memory.  I can't get my movie file to import into iMovie...any advice would be much appreciated!

    When you import into iMovie, your video is converted into DV format. This uses 13 GB per hour of video. So the first thing you have to do is make sure you have 26 GB free on your hard drive, plus another 10 GB free for OS X to work in, plus 10 GB free if you plan on burning a DVD.
    iMovie has non-destructive editing. If you have used any part of a clip and discard the rest, that entire clip is still saved in your project - unused but eating up disc space. If you use no part of a clip and discard the clip, then the clip can go in the trash and you get back disc space. So...
    If you import 2 hours of video (26 GB) as one clip and only use 2 minutes of it, your 2-minute segment still keeps its 2-hour source clip in the project, so 2 minutes of used video = 26 GB of disc space.
    One thing you can do is set the iMovie preferences to automatically create new clips every 3 minutes. (2 hours of 3-min clips = 40 clips.) This allows you to find the 2-minute segment you want and delete all of the unused clips. Say your 2-minute segment comes from 2 of these 3-min clips, so you keep 6 minutes of source footage and delete the rest. Clips that are entirely unused will be completely deleted and give you back disc space. At .65 GB per clip, your 6-minute project is now only 1.3 GB, and you've just regained 24.7 GB of disc space.

  • Moving big files(600MB) with FTP Adapter error The IO operation failed

    Hi everybody, I have the following problem:
    I need to move big files from one server to another remote server through the FTP protocol. All the configuration is correct and I am able to move small files
    with no problem, but when I move big files the server shows the following error:
    Exception occured when binding was invoked. Exception occured during invocation of JCA binding: "JCA Binding execute of Reference operation 'readEBS'
    failed due to: The IO operation failed. The IO operation failed. The "OPER[NOOP][NONE]" IO operation for "/tmp/TestLogSOA/DetalleCostos3333333.dvd"
    failed. ". The invoked JCA adapter raised a resource exception. Please examine the above error message carefully to determine a resolution.
    java.sql.SQLException: Unexpected exception while enlisting XAConnection java.sql.SQLException: XA error: XAResource.XAER_NOTA start() failed on
    resource 'SOADataSource_ohsdomain': XAER_NOTA : The XID is not valid oracle.jdbc.xa.OracleXAException at
    oracle.jdbc.xa.OracleXAResource.checkError(OracleXAResource.java:1532) at oracle.jdbc.xa.client.OracleXAResource.start(OracleXAResource.java:321) at
    weblogic.jdbc.wrapper.VendorXAResource.start(VendorXAResource.java:51) at weblogic.jdbc.jta.DataSource.start(DataSource.java:722) at
    weblogic.transaction.internal.XAServerResourceInfo.start(XAServerResourceInfo.java:1228) at
    weblogic.transaction.internal.XAServerResourceInfo.xaStart(XAServerResourceInfo.java:1161) at
    weblogic.transaction.internal.XAServerResourceInfo.enlist(XAServerResourceInfo.java:297) at
    weblogic.transaction.internal.ServerTransactionImpl.enlistResource(ServerTransactionImpl.java:507) at
    weblogic.transaction.internal.ServerTransactionImpl.enlistResource(ServerTransactionImpl.java:434) at
    weblogic.jdbc.jta.DataSource.enlist(DataSource.java:1592) at weblogic.jdbc.jta.DataSource.refreshXAConnAndEnlist(DataSource.java:1496) at
    weblogic.jdbc.jta.DataSource.getConnection(DataSource.java:439) at weblogic.jdbc.jta.DataSource.connect(DataSource.java:396) at
    weblogic.jdbc.common.internal.RmiDataSource.getConnection(RmiDataSource.java:355) at
    oracle.integration.platform.xml.XMLDocumentManagerImpl.getConnection(XMLDocumentManagerImpl.java:623) at
    oracle.integration.platform.xml.XMLDocumentManagerImpl.insertDocument(XMLDocumentManagerImpl.java:208) at
    sun.reflect.GeneratedMethodAccessor1534.invoke(Unknown Source) at
    sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) at java.lang.reflect.Method.invoke(Method.java:597) at
    org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:307) at
    org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:182) at
    org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:149) at
    org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:106) at
    org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171) at
    org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:204) at $Proxy285.insertDocument(Unknown Source) at
    oracle.integration.platform.instance.store.MessageStore.savePayload(MessageStore.java:244) at
    oracle.integration.platform.instance.store.MessageStore.savePayloads(MessageStore.java:99) at
    oracle.integration.platform.instance.InstanceManagerImpl.persistPayloads(InstanceManagerImpl.java:773) at
    oracle.integration.platform.instance.InstanceManagerImpl.persistReferenceInstanceBean(InstanceManagerImpl.java:1106) at
    oracle.integration.platform.blocks.adapter.AbstractAdapterBindingComponent.createAndPersistBindingInstance(AbstractAdapterBindingComponent.java:502)
    at oracle.integration.platform.blocks.adapter.AdapterReference.createAndPersistBindingInstance(AdapterReference.java:356) at
    oracle.integration.platform.blocks.adapter.AdapterReference.request(AdapterReference.java:171) at
    oracle.integration.platform.blocks.mesh.SynchronousMessageHandler.doRequest(SynchronousMessageHandler.java:139) at
    oracle.integration.platform.blocks.mesh.MessageRouter.request(MessageRouter.java:179) at
    Thanks!!!

    Hi idavistro,
    You can try setting the XA Transaction Timeout for SOADataSource.
    1. Log in to the WebLogic Admin Console.
    2. In the left tree, select Services -> Data Sources -> SOADataSource -> Transaction.
    3. Select Set XA Transaction Timeout.
    4. Set XA Transaction Timeout to 0.
    5. Restart the server and check if the error still appears.
    Regards,
    Neeraj Sehgal

  • Keeping "CS Web Service session" alive while uploading big files.

    Hi.
    I have a problem when uploading big files: the upload takes longer than the session timeout value, which causes it to fail.
    As you all know uploading a file is a three step process:
    1). Create a new DocumentDefinition Item on the server as a placeholder.
    2). Open an HTTP connection to the created placeholder and transfer the data using the HTTPConnection.put() method.
    3). Create the final document using the FileManager by passing in the destination folder and the document definition.
    The problem is that step 2 takes so long that the "CS Web Service Session" times out, and thus step 3 cannot be completed. The Developer Guide gives a utility method for creating an HTTP connection for step 2 and states the following: "..you must create a cookie for the given domain and path in order to keep the session alive while transferring data." But this only keeps the session of the HTTP connection alive, not the "CS Web Service Session". In my case step 2 completes successfully, and the moment I perform step 3 it throws an ORACLE.FDK.SessionError:ORACLE.FDK.SessionNotConnected exception.
    How does one keep the "CS Web Service Session" alive?
    Thanks in advance
    Regards.

    Okay, even a thread that pushes dummy stuff through once in a while doesn't help. I'm getting the following when the keep alive thread kicks in while uploading a big file.
    "AxisFault
    faultCode: {http://xml.apache.org/axis/}HTTP
    faultSubcode:
    faultString: (409)Conflict
    faultActor:
    faultNode:
    faultDetail:
    {}:return code: 409
    <HTML><HEAD><TITLE>409 Conflict</TITLE></HEAD><BODY><H1>409 Conflict</H1>Concurrent Requests On The Same Session Not Supported</BODY></HTML>
    {http://xml.apache.org/axis/}HttpErrorCode:409
    (409)Conflict
         at org.apache.axis.transport.http.HTTPSender.readFromSocket(HTTPSender.java:732)
         at org.apache.axis.transport.http.HTTPSender.invoke(HTTPSender.java:143)
         at org.apache.axis.strategies.InvocationStrategy.visit(InvocationStrategy.java:32)
         at org.apache.axis.SimpleChain.doVisiting(SimpleChain.java:118)
         at org.apache.axis.SimpleChain.invoke(SimpleChain.java:83)
         at org.apache.axis.client.AxisClient.invoke(AxisClient.java:165)
         at org.apache.axis.client.Call.invokeEngine(Call.java:2765)
         at org.apache.axis.client.Call.invoke(Call.java:2748)
         at org.apache.axis.client.Call.invoke(Call.java:2424)
         at org.apache.axis.client.Call.invoke(Call.java:2347)
         at org.apache.axis.client.Call.invoke(Call.java:1804)
         at oracle.ifs.fdk.FileManagerSoapBindingStub.existsRelative(FileManagerSoapBindingStub.java:1138)"
    I don't understand this: the exception talks about "Concurrent Requests On The Same Session", but if there is already a request going on, why is the session timing out in the first place?!
    I must be doing something really stupid somewhere. Aia ajay jay, what an unproductive day...
    Any help? It will be greatly appreciated...
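    For what it's worth, below is a generic keep-alive sketch (the pingSession callback is a placeholder for whatever cheap FDK call you have available, e.g. looking up the current user; it is not a real API name). It assumes the long data transfer of step 2 runs on its own HTTP connection, and it serializes every web-service call, including the ping, on a single lock so a ping can never overlap another request on the same session, which is what the 409 Conflict above complains about:
    import java.util.concurrent.*;

    public class SessionKeepAlive {
        private final Object wsLock = new Object(); // serializes all web-service calls
        private final ScheduledExecutorService timer = Executors.newSingleThreadScheduledExecutor();

        public SessionKeepAlive(final Runnable pingSession, long periodSeconds) {
            timer.scheduleAtFixedRate(new Runnable() {
                public void run() {
                    synchronized (wsLock) { // never ping while another call is in flight
                        pingSession.run();
                    }
                }
            }, periodSeconds, periodSeconds, TimeUnit.SECONDS);
        }

        /** Route every other web-service call through here so it shares the same lock. */
        public <T> T callWebService(Callable<T> call) throws Exception {
            synchronized (wsLock) {
                return call.call();
            }
        }

        public void close() {
            timer.shutdownNow();
        }
    }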

  • How to parse a big file with Regex/Pattern

    I want to parse a big file using Matcher/Pattern, so I thought of using a BufferedReader.
    The problem is that a BufferedReader constrains me to read
    the file line by line, and my patterns occur not only inside a line but also across the end of one line and the beginning of the next.
    For example this class:
    import java.util.regex.*;
    import java.io.*;
    public class Reg2 {
        public static void main(String[] args) throws IOException {
            File in = new File(args[1]);
            BufferedReader get = new BufferedReader(new FileReader(in));
            Pattern hunter = Pattern.compile(args[0]);
            String line;
            int lines = 0;
            int matches = 0;
            System.out.print("Looking for " + args[0]);
            System.out.println(" in " + args[1]);
            while ((line = get.readLine()) != null) {
                lines++;
                Matcher fit = hunter.matcher(line);
                //if (fit.matches()) {
                if (fit.find()) {
                    System.out.println("" + lines + ": " + line);
                    matches++;
                }
            }
            if (matches == 0) {
                System.out.println("No matches in " + lines + " lines");
            }
        }
    }
    used with the pattern "ERTA" and this file (genomic sequence):
    AAAAAAAAAAAERTAAAAAAAAAERT [end of line]
    ABBBBBBBBBBBBBBBBBBBBBBERT [end of line]
    ACCCCCCCCCCCCCCCCCCCCCCERT [end of line]
    it reports that it has found the pattern only in this line:
    "1: AAAAAAAAAAAERTAAAAAAAAAERT"
    while my pattern is present 4 times.
    Is it really a good idea to use a BufferedReader?
    Does someone have an idea?
    thanx
    Edited by: jfact on Dec 21, 2007 4:39 PM
    Edited by: jfact on Dec 21, 2007 4:43 PM

    Quick and dirty demo:
    import java.io.*;
    import java.util.regex.*;
    public class LineDemo {
        public static void main(String[] args) throws IOException {
            File in = new File("test.txt");
            BufferedReader get = new BufferedReader(new FileReader(in));
            int found = 0;
            String previous = "", next, lookingFor = "ERTA";
            Pattern p = Pattern.compile(lookingFor);
            while ((next = get.readLine()) != null) {
                String toInspect = previous + next;
                Matcher m = p.matcher(toInspect);
                while (m.find()) found++;
                // Carry the last (pattern length - 1) characters so a match that
                // spans the line break is found exactly once.
                int keep = Math.min(next.length(), lookingFor.length() - 1);
                previous = next.substring(next.length() - keep);
            }
            System.out.println("Found '" + lookingFor + "' " + found + " times.");
        }
    }
    /* test.txt contains these four lines:
    AAAAAAAAAAAERTAAAAAAAAAERT
    ABBBBBBBBBBBBBBBBBBBBBBERT
    ACCCCCCCCCCCCCCCCCCCCCCERT
    ACCCCCCCCCCCCCCCCCCCCCCBBB
    */
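    If the file fits in memory, a simpler alternative (just a rough sketch, not suited to truly huge files) is to read the whole file, strip the line breaks, and run the matcher once over the result; this also catches the occurrences that span a line break:
    import java.nio.file.*;
    import java.util.regex.*;

    public class WholeFileMatch {
        public static void main(String[] args) throws Exception {
            // Read the whole file and drop the line breaks so matches can cross lines.
            String content = new String(Files.readAllBytes(Paths.get("test.txt")))
                    .replaceAll("\\r?\\n", "");
            Matcher m = Pattern.compile("ERTA").matcher(content);
            int found = 0;
            while (m.find()) found++;
            System.out.println("Found " + found + " times."); // prints 4 for the test.txt above
        }
    }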

  • Question about reading a very big file into a buffer.

    Hi, everyone!
    I want to randomly load several characters from
    GB2312 charset to form a string.
    I have two questions:
    1. Where can I find the charset table file? I have googled
    for hours but failed to find the GB2312 charset file.
    2. I think the charset table file is very big, and I doubt
    whether I can load it into a String or StringBuffer. Does anyone
    have a solution? How can I load a very big file and randomly
    select several characters from it?
    Have I made myself understood?
    Thanks in advance,
    George

    The following gives the correspondence between GB2312-encoded byte arrays and characters (as hexadecimal integers).
    import java.nio.charset.*;
    import java.io.*;
    public class GBs {
        static String convert() throws UnsupportedEncodingException {
            StringBuffer buffer = new StringBuffer();
            String l_separator = System.getProperty("line.separator");
            Charset chset = Charset.forName("EUC_CN"); // GB2312 is an alias of this encoding
            CharsetEncoder encoder = chset.newEncoder();
            int[] indices = new int[Character.MAX_VALUE + 1];
            for (int j = 0; j < indices.length; j++) {
                indices[j] = 0;
            }
            for (int j = 0; j <= Character.MAX_VALUE; j++) {
                if (encoder.canEncode((char) j)) indices[j] = 1;
            }
            byte[] encoded;
            for (int j = 0; j < indices.length; j++) {
                if (indices[j] == 1) {
                    encoded = Character.toString((char) j).getBytes("EUC_CN");
                    for (int q = 0; q < encoded.length; q++) {
                        buffer.append(Byte.toString(encoded[q]));
                        buffer.append(" ");
                    }
                    buffer.append(": 0x");
                    buffer.append(Integer.toHexString(j));
                    buffer.append(l_separator);
                }
            }
            return buffer.toString();
        }

        // the following is for testing
        public static void main(String[] args) throws Exception {
            String str = GBs.convert();
            System.out.println(str);
        }
    }
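    To answer the original question of picking random characters without any table file, one possible sketch (assuming, as in the code above, that the EUC_CN/GB2312 charset is available in your JRE) is to build a pool of encodable characters and pick from it at random:
    import java.nio.charset.*;
    import java.util.*;

    public class RandomGB2312 {
        public static void main(String[] args) {
            CharsetEncoder enc = Charset.forName("EUC_CN").newEncoder(); // GB2312 alias
            List<Character> pool = new ArrayList<Character>();
            // Collect the CJK ideographs that the GB2312 encoder can actually encode.
            for (char c = '\u4e00'; c <= '\u9fa5'; c++) {
                if (enc.canEncode(c)) pool.add(c);
            }
            Random rnd = new Random();
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < 8; i++) { // build an 8-character random string
                sb.append(pool.get(rnd.nextInt(pool.size())));
            }
            System.out.println(sb);
        }
    }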

  • Download big files

    I am trying to download a big file from the internet. Currently I am doing the following:
                URLConnection conn = downloadURL.openConnection();
                InputStream in = conn.getInputStream();
                OutputStream out = new FileOutputStream(dst);
                // Transfer bytes from in to out
                byte[] buf = new byte[SIZE];
                int len;
                while ((len = in.read(buf)) > 0 && !cancelled) {
                    out.write(buf, 0, len);
                }
    That code works for me most of the time, but sometimes the file is not downloaded correctly, and I do not know how to test whether the download completed or how to guarantee that the file is downloaded completely.
    Is there some way to do this?
    Greetings,
    Magus

    "That statement makes no sense. I'm programming in Java, and Java has a fine mechanism for throwing exceptions." No, that statement makes no sense. Your Java code is talking TCP/IP to a server, which doesn't have such a mechanism. It can close or reset the connection; that's it. If it had reset the connection, your read() would block forever, or time out. If it had closed the connection, your read() would have returned -1, as it did, so that is what happened.
    "1. Can you expound upon what makes you think that is the reason that -1 was returned, as opposed to the connection timing out, the connection dropping, etc.?" Because no exception was thrown. Or else one was thrown and you have swallowed it, but the code you posted doesn't indicate that.
    "2. Can you indicate why Java would return a -1 instead of throwing some I/O exception?" Because the server, or possibly an intermediate firewall, closed the connection, rather than Java incurring some exception at the client end such as a timeout. The documentation you quoted at me before bears that out. That's what the -1 means. You quoted that at me yourself.
    "The whole point here is that, according to the API documentation, -1 apparently indicates that the end of the stream was reached normally, and that error conditions are indicated by exception." Exactly so. So the server closed the connection normally, or an intermediate firewall did, but before it had sent all the data. Why is another question. Have a look at the server logs, or investigate the firewall configuration.
    "I guess I'm asking if anyone has seen this behavior, and has any insight on why it doesn't seem to follow the API." It does.
    "You seem to think that this behavior is in line with the API documentation; I disagree." Well, you've quoted enough of it yourself: have another look, or another think. Premature (but normal) closing of the connection by the server or a firewall is the only possible explanation for what you're seeing. If it wasn't closed you wouldn't be getting the -1; if there was a timeout you would get a SocketTimeoutException; if there was any other exception you would catch it.
    I've seen plenty of short downloads in my life, but the server doesn't have a way of indicating that via TCP/IP. You have to count.
    NB some firewalls do close long-lived connections on their own account. Is there a client-side firewall that might be doing that?
    Or else the transmitted content-length is wrong; for example, it overflows an integer.
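    As a sketch of the "you have to count" advice above (the URL and file name below are placeholders), compare the number of bytes received with the Content-Length the server reports and treat a mismatch as an incomplete download:
    import java.io.*;
    import java.net.*;

    public class CountedDownload {
        public static void main(String[] args) throws IOException {
            URL downloadURL = new URL("http://example.com/big.file"); // placeholder
            File dst = new File("big.file");                          // placeholder
            URLConnection conn = downloadURL.openConnection();
            long expected = conn.getContentLengthLong(); // -1 if the server did not send it
            long received = 0;
            try (InputStream in = conn.getInputStream();
                 OutputStream out = new FileOutputStream(dst)) {
                byte[] buf = new byte[8192];
                int len;
                while ((len = in.read(buf)) != -1) {
                    out.write(buf, 0, len);
                    received += len;
                }
            }
            if (expected >= 0 && received != expected) {
                throw new IOException("Incomplete download: got " + received + " of " + expected + " bytes");
            }
        }
    }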

  • OVM Manager 3.1.1 CLI clone big files error

    Hello,
    We use the OVM Manager CLI for backup purposes. During a clone operation of a big file, after about 10 to 12 minutes we get an error:
    Error Msg: com.oracle.odof.exception.PermissionException: Exchange is not connected
    But the clone operation continues and finishes successfully.
    So, from the client's point of view the operation failed, but on the server it succeeded.
    Clone command is as follows:
    ssh admin@ovmm -p 10000 "clone VirtualDisk id=$VM_FILE_ID target=$VM_FILE_REPO_ID cloneType=Sparse"
    The whole message:
    Command: clone VirtualDisk id=0004fb00001200004d032c969c42095d.img target=0004fb000003000072340b1e2eb70904 cloneType=Sparse
    Status: Failure
    Time: 2013-01-18 14:42:58.955
    Error Msg: com.oracle.odof.exception.PermissionException: Exchange is not connected
    Fri Jan 18 14:42:58 CET 2013
    OVM> Failed to complete command(s), error happened. Connection closed.
    We blamed timeouts, but after setting higher values nothing has changed.
    Thank you in advance
    Gregory

    This really sounds like the nasty timeout issue which made its first appearance in the early builds of OVM 3.0.3 and prevented, e.g., the creation of rather large storage repositories over iSCSI when using a standard 1 GBit connection. The command would simply time out on the OVMM after 120 seconds, rather than waiting for the ovs-agent to report back…
    In OVM 3.1.1 (368) this timeout was upped to 10 minutes, but I knew that this would only last for a few months… and I told Oracle Support so, but eh… you know… ;)

  • Big file size

    Hi. I notice some of my files are 8 times bigger than others.
    Average size for some books is 250 KB. Similar files in other books are 2,000+ KB.
    I import referenced graphics.
    Graphics sizes are tiny (16 kb, 8 kb, etc)
    Only 1 or 2 graphics per file.
    Number of pages similar (5-10)
    Any idea why some are so big and others aren't?
    Any idea how to downsize the big ones?
    Thanks

    Mike,
    The Save FrameImage option forces FM to always embed an uncompressed
    copy of any referenced image in FrameImage format. This is a sure fire
    way to bloat your file.
    Note, if you have any OLE objects, then they will also be stored
    internally in an uncompressed manner.
    The surest way is to save one of the small files and one of the big
    files as MIF and then use either a text editor or Graham Wideman's
    MifBrowser tool (see
    http://www.grahamwideman.com/gw/tech/framemaker/mifbrowse.htm) to
    inspect the imported graphics and see whether each is just a link to a
    file name or whether image or vector facets are actually embedded.

  • How much larger is an iMovie file imported in FCPX

    Just planning to upgrade to FCPX. I would like to know how much larger an iMovie event file will be when imported into FCPX.
    Also, for instance, if a camera .mts file is 10GB, how big will it be when imported into FCPX?

    None of you using FCPX ???

  • Big table import trouble

    Instead of one big export/import, can I do several cumulative exports/imports, i.e. can I append rows onto an existing, already-populated table?
    Thanks
    MN

    MN,
    If you don't have any unique constraint or primary key issues, you can definitely do that:
    imp username/password file=mytableexport.dmp tables=table1 ignore=Y rows=Y constraints=N indexes=N
    Regards

  • Upload of very big files (300MB+)

    Hello all,
    I am trying to create an application in HTML DB to store files in the database (as BLOBs) via the web browser. I created all the needed components and am now stuck with the problem of uploading big files. Basically, when a file is over 100MB the upload becomes unreliable, and for really big files it does not work at all. Can somebody help me figure out how to upload big files into an HTML DB application? Any hints and suggestions are welcome. Examples will be even more appreciated.
    Sincerely,
    Ian

    Ian,
    When you say "big files does not work at all", what do you see in the browser? Is no page returned at all?
    When a file is uploaded, it takes some amount of time to simply transfer the file from the client to modplsql. If you're on a local Gbit network, this is probably fast. If you're doing this over a WAN or over the Internet, this is probably fairly slow. As modplsql gets this uploaded file, it writes it to a temporary BLOB. Once fully received, modplsql will then insert this into the HTML DB upload table.
    I suspect that the TimeOut directive in Apache/Oracle HTTP Server is kicking in here. The default setting for this is 300 (5 minutes).
    I believe the timeout is reset by modplsql during file transfer to avoid a timeout operation while data is still being sent. Hence, I believe the insertion of your large file into the file upload table is taking longer than the TimeOut directive.
    The easy answer is to consider increasing your TimeOut directive for Apache/Oracle HTTP Server.
    The not so easy answer is to investigate why it takes so long for this insert, and tune the database accordingly.
    Hope this helps.
    Joel

  • Photoshop CC slow in performance on big files

    Hello there!
    I've been using PS CS4 since release and upgraded to CS6 Master Collection last year.
    Since my OS broke down some weeks ago (the RAM broke), I gave Photoshop CC a try. At the same time I moved to new rooms and couldn't get my hands on the DVD of my CS6, which is resting somewhere at home...
    So I tried CC.
    Right now I'm using it with some big files. File size is between 2GB and 7.5GB max (all PSB).
    Photoshop seemed to run fast in the very beginning, but for a few days now it's been so unbelievably slow that I can't work properly.
    I wonder if it is caused by the growing files or some other issue with my machine.
    The files contain a large number of layers and masks, nearly 280 layers in the biggest file (mostly with masks).
    The images are 50 x 70 cm @ 300dpi.
    When I try to make some brush strokes on a layer mask in the biggest file, it takes 5-20 seconds for the brush to draw... I couldn't figure out why.
    And it doesn't depend on the brush size as much as you might expect... even very small brushes (2-10 px) show this issue from time to time.
    Also, switching masks (gradient maps, selective color or levels) on and off takes ages to be displayed, sometimes more than 3 or 4 seconds.
    The same with panning around in the picture, zooming in and out, or moving layers.
    It's nearly impossible to work on these files in time.
    I've never seen this in CS6.
    Now I wonder if there's something wrong with PS or the OS. But I've never worked with files this big before.
    In March I worked on some 5GB files with 150-200 layers in CS6, and it worked like a charm.
    SystemSpecs:
    i7-3930K (3.8 GHz)
    Asus P9X79 Deluxe
    64GB DDR3 1600Mhz Kingston HyperX
    GTX 570
    2x Corsair Force GT3 SSD
    Wacom Intuos 5 M Touch (I have some issues with the touch from time to time)
    Win 7 Ultimate 64
    all system updates
    newest drivers
    PS CC
    System and PS are running on the first SSD, scratch is on the second. Both are set to be used by PS.
    79% of the RAM is allocated to PS, the cache level is set to 5 or 6, and history states (protocol objects) are set to 70. I also tried different cache tile sizes from 128K to 1024K, but it didn't help a lot.
    When I open the largest file, PS takes 20-23 GB of RAM.
    Any suggestions?
    best,
    moslye

    Is it just slow drawing, or is actual computation (image size, rotate, GBlur, etc.) also slow?
    If the slowdown is drawing, then the most likely culprit would be the video card driver. Update your driver from the GPU maker's website.
    If the computation slows down, then something is interfering with Photoshop. We've seen some third party plugins, and some antivirus software cause slowdowns over time.

  • My scanner isn't a choice in File Import in PSE 8 for Mac

    I just loaded PSE 8 on my new MacBook (Snow Leopard), but my Epson scanner doesn't appear in File > Import. I've followed all the instructions I can find, including loading Rosetta and downloading the TWAIN drivers from the Epson site. Can anyone help?

    You will need to go to Epson's site to see what is available for your scanner model. If you find one, download it and follow the instructions for installing it.
    Sorry I can't be more specific, but I have a canon scanner (and it's so old there's never going to be an intel plug-in for it).
