Mac faster at handling large files than a PC?

Hi guys,
I don't want to start an OS war here, but I just wonder...
I have a pretty fast quad-core laptop with 8 GB RAM and Windows 7 (64-bit), and it still doesn't seem to handle bigger files much faster than my previous machine did. (Or am I spoilt already?)
Does anybody have experience with the latest MacBook Pro with 8 GB RAM, using Adobe CS on bigger files every day (files around 500 MB to 1 GB in Photoshop, for example), compared with a similar Windows machine?
Thanks for your replies,
Björn

Macs have faster interfaces like FireWire 800 and ExpressCard slots that allow eSATA devices for very fast transfer speeds. As far as internal storage goes, SSDs and RAID setups are just the same as on PCs.
Most likely it comes down to this: PCs are built much more cheaply on a widespread scale, while Macs are used a lot in the video and creative fields where high I/O throughput is required. So by default a lot of Macs ship with fast interfaces, and by default a lot of PCs ship with slow ones.
Windows PCs also require anti-malware running, regular defragmenting and other upkeep to maintain performance, which isn't as much the case with Macs.
Someone who knows what they are doing can make either machine very fast.

Similar Messages

  • Handling large files in scope of WSRP portlets

    Hi there,
    just wanted to ask if there are any best practices for handling large file uploads/downloads with WSRP portlets (apart from bypassing WebCenter altogether for these use cases, that is). We keep getting OutOfMemoryErrors and TimeoutExceptions as soon as the file being transferred grows beyond a few hundred megabytes. The portlet is happily streaming the file as part of its javax.portlet.ResourceServingPortlet.serveResource(ResourceRequest, ResourceResponse) implementation, so the problem must somehow lie within WebCenter itself.
    Thanks in advance,
    Chris
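    For reference, a plain buffered-streaming serveResource along the lines Chris describes might look like the sketch below. This is a minimal, generic illustration rather than WebCenter-specific code; the file path, content type and buffer size are assumptions, and the open question remains whether the WSRP/WebCenter layer buffers the whole response in memory regardless of how the portlet streams it.

    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import javax.portlet.GenericPortlet;
    import javax.portlet.ResourceRequest;
    import javax.portlet.ResourceResponse;

    public class LargeFilePortlet extends GenericPortlet {

        // Illustrative path; in practice this would be derived from the request.
        private static final Path FILE = Paths.get("/data/export/large-file.bin");

        @Override
        public void serveResource(ResourceRequest request, ResourceResponse response)
                throws IOException {
            response.setContentType("application/octet-stream");
            // Stream in fixed-size chunks so the portlet itself never holds
            // more than one buffer of the file in memory at a time.
            byte[] buffer = new byte[64 * 1024];
            try (InputStream in = Files.newInputStream(FILE);
                 OutputStream out = response.getPortletOutputStream()) {
                int read;
                while ((read = in.read(buffer)) != -1) {
                    out.write(buffer, 0, read);
                }
            }
        }
    }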


  • Handling Large Files in XI

    We have designed a couple of integration processes for the project. We need to handle messages larger than 5 MB at a rate of approximately 100 messages/hour, and we did the tuning as per the tuning guide. We are now facing the following issues:
    1. A JCo connection failure error occurs whenever large files are handled by XI.
    2. An ICM_HTTP_INTERNAL_ERROR occurs whenever large files are handled by XI.
    Does anyone have a solution for these issues?

    I am sure you have already checked the sizing requirements for large messages. If not, it might be worth looking at this:
    The memory consumption of XI depends on the number of processes running in parallel and the size of the message.
    In general, extra sizing for XI memory consumption is not required. The total memory of the SAP Web Application Server should be sufficient, except in the case of large messages (>1 MB).
    To determine the memory consumption for processing large messages, you can use the following rules of thumb:
      Allocate 3 MB per process (for example, the number of parallel messages per second may be an indicator).
      Allocate 4 kB per 1 kB of message size in the asynchronous case, or 9 kB per 1 kB of message size in the synchronous case.
      Example: asynchronous concurrent processing of 10 messages, each 1 MB in size, requires (3 MB + 4 * 1 MB) * 10 = 70 MB of memory.
    With mapping or content-based routing, where an internal representation of the message payload may be necessary, the memory requirements can be much higher (possibly exceeding 20 kB per 1 kB of message, depending on the type of mapping).
    The size of the largest message thus depends mainly on the available main memory. On a normal 32-bit operating system there is an upper bound of approximately 1.5 to 2 GB per process, which limits the largest feasible message size.
    Hope it helps.
    Cheers, Sachin K
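    The rule of thumb above is easy to turn into a quick calculator. The sketch below is my own helper, not an SAP tool; it uses exactly the constants quoted in the reply.

    public class XiMemoryEstimate {

        // Sizing rule of thumb from the note above: 3 MB base per process,
        // plus 4x the message size (asynchronous) or 9x (synchronous).
        static double estimateMb(int parallelMessages, double messageSizeMb, boolean synchronous) {
            double factor = synchronous ? 9.0 : 4.0;
            return (3.0 + factor * messageSizeMb) * parallelMessages;
        }

        public static void main(String[] args) {
            // The worked example from the reply: 10 async messages of 1 MB each.
            System.out.println(estimateMb(10, 1.0, false) + " MB"); // prints 70.0 MB
        }
    }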

  • Handling large files with FTP in XI

    Hi All,
    I have a file scenario where I have to post a file of more than 500 MB, with up to 700 fields in each line.
    Another scenario worked fine with a file size below 70 MB and fewer fields.
    Could anyone help me in handling such a scenario, with a large file size and without splitting the file?
    1) From your previous experience, did you use any tools to help with the development of the FTP interfaces?
    2) The client looked at ItemField but is not willing to use it due to the licensing costs. Did you use any self-made pre-processors?
    3) Please let me know the good and bad experiences you had when using the XI File Adapter.
    Thanks & Regards,
    Raghuram Vedagiri.

    500 MB is huge. XI will not be able to handle such a huge payload, for sure.
    Are you using XI as a mere FTP mover, or are you using content conversion with mapping etc.?
    1. Either use splitting logic to split the file outside XI (using scripts; a minimal sketch follows below) and let XI handle the resulting files,
    2. or size up your hardware (Java heap etc.) so that XI can handle the file (not recommended, though). SAP recommends about 5 MB as the optimum message size.
    Regards
    Bhavesh
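    If you take option 1, the external splitting logic can be very simple. The sketch below is a minimal Java example of my own, not an SAP tool; the input path, output naming and chunk size are assumptions, and it splits on line boundaries so that each record stays intact.

    import java.io.BufferedReader;
    import java.io.BufferedWriter;
    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;

    public class FileSplitter {

        public static void main(String[] args) throws IOException {
            Path input = Paths.get("big-transfer.txt"); // assumed input file
            int linesPerChunk = 50_000;                 // assumed chunk size

            try (BufferedReader reader = Files.newBufferedReader(input)) {
                String line;
                int lineCount = 0;
                int chunkIndex = 0;
                BufferedWriter writer = null;
                while ((line = reader.readLine()) != null) {
                    // Start a new chunk file every linesPerChunk lines, so each
                    // record (line) lands intact in exactly one chunk.
                    if (lineCount % linesPerChunk == 0) {
                        if (writer != null) writer.close();
                        writer = Files.newBufferedWriter(Paths.get("chunk-" + chunkIndex++ + ".txt"));
                    }
                    writer.write(line);
                    writer.newLine();
                    lineCount++;
                }
                if (writer != null) writer.close();
            }
        }
    }

    XI can then pick the chunk files up from the drop directory as ordinary, individually sized messages.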

  • Hardware Question – Handling large files in Photoshop

    I'm working with some big TIFF files (~1 GB) for large-scale hi-res printing (60" x 90", 10718 x 14451), and my system is lagging hard like never before (Retina MacBook Pro 2012, 2.6 GHz i7 / 8 GB RAM / 512 GB HD).
    So far I've tried:
    1) converting to .psd and .psb
    2) changing the scratch disk to an external Thunderbolt SSD
    3) allocating all available memory to the program within Photoshop preferences
    4) closing all other applications
    In general I'm being told that I don't have enough RAM. So what are the minimum recommended system requirements to handle this file size more comfortably? Newest Retina Pro with 16GB RAM? Or switch to iMac w/ 32? Mac Pro?
    Thanks so much!


  • Handling Large files in PI scenarios?

    Hello,
    We have a lot of scenarios (almost 50) where we deal with file interfaces on at least the receiver or sender side. Some of them are plain file transfers where we use AAE, and in some we have to do message mapping (sometimes very complex ones).
    The interfaces work perfectly fine with a normal file that doesn't have many records, but recently we started testing big files with over 1,000 records, and they take a long time to process. This also causes other messages lined up in the same queue to wait for however long the first message takes.
    This must be a very common scenario where PI has to process large files, especially files coming from banks. What is the best way to handle their processing? Apart from getting better system hardware (we are currently in the test environment; the production environment will definitely be better), is there any technique that might help us improve the processing of large files without data loss and without holding up other messages?
    Thanks,
    Yash

    Hi Yash,
    Check these blogs for the structure you are mentioning:
    /people/shabarish.vijayakumar/blog/2006/02/27/content-conversion-the-key-field-problem
    /people/shabarish.vijayakumar/blog/2005/08/17/nab-the-tab-file-adapter
    Regards,
    ---Satish

  • How does Time Machine handle large files?

    I'm relatively new to the whole Time Capsule / Time Machine process and have learned that large files (e.g. an Aperture library) are backed up in full each time they change, which can lead to the TC filling up quicker than normal.
    How does this work with daily and weekly backups?
    For example, say my Aperture library is 1 GB and I import a load of photos from my camera, taking it up to 2 GB. I've learned that I should disable Time Machine while I'm in Aperture (or at least before 10.6... not sure now). So say I've done that, imported the files into Aperture, but want to edit them later and ultimately move them into iPhoto to keep the Aperture library small.
    When I turn Time Machine back on, the next hourly backup will see that the library has changed and back it up, and this goes on until a daily backup has been taken - does that delete the 24 hourly backups, or does it merge them?
    If I then do the editing the following week, export the photos, and the library is back to 1 GB again... backed up hourly/daily/weekly etc., what am I left with?
    Do I have the original, the 2 GB version and the new 1 GB version, i.e. 4 GB? Is there a cunning way I can make the changes within a week so only one of the versions ends up in the backup?

    Orpheus999 wrote:
    When I turn Time Machine back on, the next hourly backup will see that the library has changed and back it up, and this goes on until a daily backup has been taken - does that delete the 24 hourly backups, or does it merge them?
    The Time Machine panel of System Preferences says this:
    Time Machine keeps
    - Hourly backup for the past 24 hours
    - Daily backups for the past month
    - Weekly backups until your backup disk is full
    Each time Time Machine runs it creates what appears to be an entirely new backup set, although it does this in a way that doesn't require it to copy files that have already been copied (unchanged files are hard-linked into each new backup set rather than duplicated). So merging isn't necessary. Another effect of how it operates is that each unique version of a file (as opposed to packages of files) only exists on the backup volume once.
    According to the contents of my Time Machine backup file, hourly backups are literally kept for 24 hours, not until the next "daily" backup. For a "daily" backup, it seems to keep the oldest "hourly" backup of that day.
    If I then do the editing the following week, export the photos, and the library is back to 1 GB again... backed up hourly/daily/weekly etc., what am I left with?
    Do I have the original, the 2 GB version and the new 1 GB version, i.e. 4 GB? Is there a cunning way I can make the changes within a week so only one of the versions ends up in the backup?
    You might be able to exclude those files from being backed up at certain times, but I can't be sure this would result in older copies of those files being retained.

  • Java proxies for handling large files

    Dear all,
    Kindly explain the same step by step, as I do not know much about Java.
    What is the advantage of using Java proxies here? Do we implement the split logic in Java code for handling a 600 MB file?
    Please mail me the same at [email protected]

    Hi Srinivas,
    Check out this blog for the large file handling issue:
    /people/sravya.talanki2/blog/2005/11/29/night-mare-processing-huge-files-in-sap-xi
    This will help you.
    Please also see the documents below; these might help:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/a068cf2f-0401-0010-2aa9-f5ae4b2096f9
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/f272165e-0401-0010-b4a1-e7eb8903501d
    /people/prasad.ulagappan2/blog/2005/06/27/asynchronous-inbound-java-proxy
    /people/rashmi.ramalingam2/blog/2005/06/25/an-illustration-of-java-server-proxy
    We can also find them on your XI/PI server in these folders:
    aii_proxy_xirt.jar
    j2ee\cluster\server0\bin\ext\com.sap.aii.proxy.xiruntime
    aii_msg_runtime.jar
    j2ee\cluster\server0\bin\ext\com.sap.aii.messaging.runtime
    aii_utilxi_misc.jar
    j2ee\cluster\server0\bin\ext\com.sap.xi.util.misc
    guidgenerator.jar
    j2ee\cluster\server0\bin\ext\com.sap.guid
    Please reward points if useful.

  • Illustrator CC 2014 crashing Mac when saving large file

    Hello,
    I hope someone can help. When I am working on a large file in Illustrator CC 2014, working from within my Mac (not a network drive), my Mac resets itself and I get the power symbol, grey screen and text.
    Does anyone know why this happens? I'm sorry if I have posted this in the wrong community. Below is my Mac spec:
    Processor: 3.4 GHz Intel Core i7
    Memory: 16 GB 1600 MHz RAM
    OS X version: 10.9.1
    Thank you,
    Jade

    Sure, Jade.
    Definitely I will. I am working with Adobe and Apple to get to some conclusion.
    Thanks
    James

  • Handling Large File

    Hi all,
    We need to handle a large file, 880 MB. Is there any provision at the adapter level to break the file into smaller chunks?
    Can we avoid using shell scripts and OS-level commands?
    Thanks,
    Srinivas

    Hi Srinivas,
       if it is a text file then you could break up the file into multiple recordsets,
      e.g.,
    [Converting Text Format in the Sender File/FTP Adapter to XML   |http://help.sap.com/saphelp_nwpi711/helpdata/en/44/658ac3344a4de0e10000000a1553f7/frameset.htm]
    and
    [#821267 File Adapter FAQ|http://service.sap.com/sap/support/notes/821267]
    14. Memory Requirements
    Regards
      Kenny
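    (For context: with content conversion in the sender file/FTP adapter, the chunking Kenny describes is controlled by the "Recordsets per Message" parameter, which makes the adapter emit one message per N recordsets instead of a single huge message. The linked documentation and note #821267 cover the details, though the exact field name may vary between releases.)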

  • Handling large files

    Hi,
    How do I process large files in SAP XI?
    Thanks,
    Manogna

    Hi Seshagiri,
    check the blogs below:
    Night Mare - Processing huge files in SAP XI
    Step-by-Step Guide in Processing High-Volume Messages Using PI 7.1's Message Packaging
    SAP XI acting as a (huge) file mover
    Managing bulky flat messages with SAP XI (tunneling once again) - UPDATED
    Regards,
    kummari

  • Handling large files for displaying within a JTextPane

    I have some (very) large trace files, sometimes more than 100 MB, and I want to group the traces, which are stored unordered in the trace file.
    When I tried to read in a trace file of just 16 MB using the command-line option -Xmx60m, I got an OutOfMemoryError.
    Is there a way to handle (very) large (text) files more effectively? For instance, how can I group (some kind of sorting) a trace file of, let's say, 100 MB most efficiently? Maybe by splitting the file?
    I want to display the text of these files in a JTextPane, coloring particular parts of the traces.
    Thank you in advance,
    Dirk
    Berlin, Germany
    [email protected]

    I don't know of any way to display 100MB of text effectively. I'm sure the user isn't going to sit there and scroll through 100MB of data to find what they are looking for. I would probably load the data into a database and create some queries the user can invoke to filter the data.
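    As a middle ground before a full database, the grouping itself can be done without ever holding the whole file in memory: stream the file once, append each trace line to a per-group file, then load only the group the user asks for into the JTextPane. A minimal sketch of my own follows; the assumption that the group key is the first whitespace-separated token of each line is illustrative and would need adapting to the real trace format.

    import java.io.BufferedReader;
    import java.io.BufferedWriter;
    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.HashMap;
    import java.util.Map;

    public class TraceGrouper {

        public static void main(String[] args) throws IOException {
            Map<String, BufferedWriter> writers = new HashMap<>();
            try (BufferedReader reader = Files.newBufferedReader(Paths.get("trace.log"))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    // Assumed key: first whitespace-separated token (e.g. a thread id).
                    String key = line.split("\\s+", 2)[0];
                    BufferedWriter w = writers.get(key);
                    if (w == null) {
                        w = Files.newBufferedWriter(Paths.get("group-" + key + ".log"));
                        writers.put(key, w);
                    }
                    w.write(line);
                    w.newLine();
                }
            } finally {
                // Flush and close every per-group output file.
                for (BufferedWriter w : writers.values()) {
                    w.close();
                }
            }
        }
    }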

  • Media Encoder CC 2014 Exports larger file than Media Encoder CC with same settings

    We like to deliver final exports to most of our clients with the following settings:
    Format: Quicktime
    Codec: H.264
    Quality: 100
    1920x1080, 23.976, progressive, square pixels
    use maximum render quality
    audio: uncompressed, 48khz, stereo, 16 bit, single track
    When I used to export a ~5 minute video using these settings with Premiere or Media Encoder CC (7.2.2.29), I would get a file ~1GB. If I export a ~5 minute video using these settings in Premiere or Media Encoder CC 2014 (8.0.1.48), the result is ~ 4GB.
    Any thoughts on why this might be? I assume it's a user error but I can't nail down what it is.
    Thanks for your help!

    Hi matthewrichie88,
    I used the same settings you mentioned above on a Windows 7 computer and got the same file size with both versions of Media Encoder. Please specify whether you are on a Mac or Windows, and which OS version exactly. It would really help if you could post screenshots of your export settings in both versions, and give more details about the source files as well.
    Thanks!
    Rameez
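    (As a rough sanity check on those numbers: a 5-minute, ~1 GB file works out to about 8192 Mbit / 300 s ≈ 27 Mbps, while ~4 GB is ≈ 109 Mbps, so the CC 2014 export is effectively encoded at roughly four times the bitrate. Comparing the effective bitrate/quality settings in the two versions' export dialogs would be the first place to look.)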

  • Best way to handle large files in FCE HD and iDVD.

    Hi everyone,
    I have just finished working on a holiday movie that my octogenarian parents took. They presented me with about 100 minutes of raw footage that I have managed to edit down to 64 minutes. They have viewed the final version, which I recorded back to tape for them. They now want to know if I can put it onto a DVD for them as well. The problem is that the FCE HD file is 13 GB.
    So here is my question.
    What is the best way to handle this problem?
    I have spoken to a friend of mine who is a professional editor. She said to reduce the movie to about 15 minutes because it's probably too long and boring (rather hurtful, really). Anyway, that is out of the question as far as my oldies are concerned.
    I have seen info on Toast 8 that mentions a "Fit to DVD" process that purports to "squash" 9 GB of movie onto a 4.7 GB disc. I can't find out whether it will also put 13 GB onto a dual-layer 8.5 GB disc.
    Do I have to split the movie into two parts and make two dual-layer DVDs? If so, I have to ask: how come "Titanic", at 3 hrs+, fits on one disc?
    Have I asked too many questions?

    Take a deep breath. Relax. All is fine.
    iDVD does not look at the size of your video file; it looks at the length. iDVD can accommodate up to 2 hours of movie. That is also the answer to the "Titanic" question: a DVD holds a fixed number of bits, so the video is re-encoded as MPEG-2 at a bitrate chosen to fit the running time onto the disc, which is exactly what iDVD does with your export.
    iDVD gives you different options depending on the length of your movie. Although I won't agree with your friend about cutting your movie to 15 minutes, if you could trim out a few minutes to get it under an hour, that setting in iDVD ("Best Performance", though the new version may have renamed it) gives you the best quality. Still, any iDVD setting will give you good quality, even at 64 minutes.
    In FCE, export as QuickTime Movie, NOT any flavour of QuickTime Conversion. Select chapter markers if you have them. If everything is on one system, uncheck the "Make Movie Self-Contained" box. Drop the QT file into iDVD.

  • Generate PDF from App. Srv. Produces Larger File than from Reports Builder

    Good day,
    I have the issue below and need your help, please.
    When generating a PDF report from the application server, the PDF is large (200 MB), while the same report generated from Reports Builder is much smaller (10 MB).
    - Report Builder 9.0.2.0.3
    - The report contains only TEXT, no images, and has around 4,000 pages.
    The steps tried from the application server command line:
    - First I tested several values for the OUTPUTIMAGEFORMAT parameter: the file size stays the same or increases.
    - Second I tested the PDFCOMP parameter (PDF compression) with values from 0 to 9: the file size stays the same.
    Kindly help...

    From Reports Builder:
    I open only the .rdf and generate the output to a PDF file.
    The page count is from the PDF generated by Reports Builder; the other one I can't open with a PDF reader...
    Thx.
    MAN
