Excessive pagefile size

With an uptime of just 2 days (from the "uptime" unix command), my virtual memory size is 14.21GB. I have four applications loaded aside from Finder - Dashboard, Safari, Photoshop, and a small HTML editor.
Worryingly, my ratio of pageins to pageouts is 425,000 to 325,000. I realise this may mean I need more RAM, but my question is, why the high usage after two days?
Let's go through Activity Monitor. Each widget in my Dashboard (of which I have 10) is using ~350 MB. In fact, all are in the 300s. Dock and Finder are each using ~350MB. iCalAlarmScheduler is using ~350 MB! Wow. iTunes helper is using... ~350MB. You get the point. Everything is using around ~350 MB, aside from Photoshop CS2 and Safari, which are using 1.42GB and 1.23GB respectively. I have a couple of small (200x200 pixel) images open in Photoshop, and four small webpages open in Safari.
I suspect that the apps may have been allocated large amounts of virtual memory by the OS but are not actually using it. Photoshop and Safari may be stealing all my physical RAM and causing the high number of pageouts. Could this be the case?

Some Dashboard Widgets have been reported to have memory leaks.
Also see "Problems from insufficient RAM and free hard disk space".
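If you want to watch the paging counters over time rather than re-reading Activity Monitor, something like the sketch below will do it. It just shells out to the stock vm_stat tool and diffs the Pageins/Pageouts counters once a minute; the field labels are the ones vm_stat prints on the releases I've seen, so treat the parsing as an assumption to check against your own vm_stat output.
# Sketch: watch pageins/pageouts per minute via the stock vm_stat tool.
import re
import subprocess
import time

def paging_counters():
    """Return (pageins, pageouts) as reported by vm_stat."""
    out = subprocess.check_output(["vm_stat"], universal_newlines=True)
    def grab(label):
        m = re.search(label + r":\s+([\d.]+)", out)
        return int(m.group(1).replace(".", "")) if m else 0
    return grab("Pageins"), grab("Pageouts")

last_in, last_out = paging_counters()
while True:
    time.sleep(60)
    cur_in, cur_out = paging_counters()
    # Steadily climbing pageouts mean real memory pressure; a large virtual
    # size on its own (address space that was never touched) does not.
    print("pageins/min: %d   pageouts/min: %d"
          % (cur_in - last_in, cur_out - last_out))
    last_in, last_out = cur_in, cur_out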

Similar Messages

  • Quicktime Pro - get rid of excess file size

    Hi,
    I have FCS2 on some computers, but not on others (licensing), so I want to use QT Pro to do simple editing on those... mainly to cut off extra blank footage when I capture too much from an analog source.
    I know how to edit the footage in QT Pro, but even though the Finder shows the new duration (and if I import it into FCP it shows the new duration too), the Finder still shows the file size as the original, huge size.
    E.g. I capture a 30 min tape, but let the capture run for 2 hrs, say. So the DV codec file I have is 25 GB or so. I edit it down to the 30 mins, and it shows in the Finder as having this new 30 min duration, but the Finder also says the 30 min file is still 25 GB.
    I understand non-destructive editing in FCP, and one can use Media Management to get rid of the excess footage.. (to kinda make it destructive..?)
    Is QuickTime Pro a non-destructive editor only? I'd love it if someone could tell me that it's not... that with a certain quick action or two I could scuttle the excess footage without having to go through an export scenario.
    Pete

    Is Quicktime Pro a non-destructive editor only?
    No, not as long as you trim the content to your selection and then save the result as a separate, independent file. (Most use the "standalone" option here and either archive or delete the original content.)
    I know how to edit the footage in QT Pro, but even though the Finder shows the new duration (and if I import it into FCP it shows the new duration too), the Finder still shows the file size as the original, huge size.
    Just ran a quick test on my system (G5 2.0 PPC under Leopard 10.5.8 using QT Pro 7.6.4 and FCP v6.0.6) with the result that the Finder Info, QT 7 Inspector, and FCP Properties windows all agreed that the trimmed content file was indeed smaller (in my case about half the size) than my original file.

  • Set pagefile size to 1.5x RAM

    Hi guys,
    Could not find another forum for this. Trying to automate Server 2012 R2 deployments: in our normal checklist for deploying an OS, someone has to go in and set the pagefile to 1.5x RAM on the server. The majority of our servers have 4 GB, but it can vary by a few GB. What I am looking for is something I can have run from Altiris to query the amount of RAM installed and then set the pagefile initial & max to 1.5x the RAM installed.
    Any help is appreciated. Thanks.

    JRV,
    Everything I've found on TechNet says the same: 2008 and above have improved greatly in how the pagefile is handled automatically. I have conveyed that to our server manager, but he still thinks that we should be setting a static size, and I'm still trying to sway him. If we do have a large static pagefile, we usually move it to another drive; 90% of our data center is virtual. But I'm still looking for a way to automate it in the event he overrules my recommendation and insists that we set it at 1.5x.
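    In case you do end up having to set it statically, here is a rough sketch of the kind of thing Altiris could run. I've written it in Python around the standard wmic aliases (Win32_ComputerSystem / Win32_PageFileSetting); the C:\pagefile.sys path and the exact invocations are assumptions on my part, so test it on one box before putting it into a deployment job.
    # Sketch: set the pagefile to 1.5x installed RAM (run elevated).
    import os
    import subprocess

    def wmic(*args):
        return subprocess.check_output(["wmic"] + list(args),
                                       universal_newlines=True)

    # Installed RAM in bytes -> target pagefile size in MB
    out = wmic("computersystem", "get", "TotalPhysicalMemory", "/value")
    total_bytes = int(out.strip().split("=")[1])
    target_mb = int(total_bytes / (1024 * 1024) * 1.5)

    # Manage the pagefile size ourselves instead of letting Windows do it
    wmic("computersystem", "where",
         "name='%s'" % os.environ["COMPUTERNAME"],
         "set", "AutomaticManagedPagefile=False")

    # Initial and maximum both set to 1.5x RAM so the file never resizes.
    # If automatic management was on until now there may be no pagefileset
    # instance yet; 'wmic pagefileset create name=...' (or a reboot) fixes that.
    wmic("pagefileset", "where", r"name='C:\\pagefile.sys'",
         "set", "InitialSize=%d,MaximumSize=%d" % (target_mb, target_mb))

    print("Pagefile set to %d MB" % target_mb)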

  • Excessive LOBSEGMENT size and number of extents

    I am running a procedure to load a large number of records (>1 million). The records that I am loading have only a few columns, including an XMLTYPE column containing the data from an XML file of about 10 KB. After reaching the 1 million mark, the size of the LOBSEGMENT associated with the XMLTYPE column is nearing 20 GB and has nearly 600 extents! I understand that "auto" segment management is the cause of the number of extents, but why is the LOBSEGMENT SO LARGE???
    Thanks in advance for any suggestions or ideas...
    Below are the tablespace settings:
    BLOCK_SIZE : 8192
    INITIAL_EXTENT : 65536
    NEXT_EXTENT :
    MIN_EXTENTS : 1
    MAX_EXTENTS : 2147483645
    PCT_INCREASE :
    MIN_EXTLEN : 65536
    STATUS : ONLINE
    CONTENTS : PERMANENT
    LOGGING : NOLOGGING
    FORCE_LOGGING : NO
    EXTENT_MANAGEMENT : LOCAL
    ALLOCATION_TYPE : SYSTEM
    PLUGGED_IN : NO
    SEGMENT_SPACE_MANAGEMENT : AUTO
    DEF_TAB_COMPRESSION : DISABLED
    RETENTION : NOT APPLY
    BIGFILE : NO

    Yay, we're almost done :)
    TEST.SQL>create table yjam.tablename (a number);
    Table created.
    TEST.SQL>create table cds.tablename (a number);
    Table created.
    TEST.SQL>SELECT EXTENTS, (bytes/1024/1024) MB FROM user_segments u
      2  WHERE segment_type = 'TABLE'
      3  AND u.segment_name = 'TABLENAME'
      4  /
    no rows selected
    TEST.SQL>l1
      1* SELECT EXTENTS, (bytes/1024/1024) MB FROM user_segments u
    TEST.SQL>c:user:dba
      1* SELECT EXTENTS, (bytes/1024/1024) MB FROM dba_segments u
    TEST.SQL>/
       EXTENTS         MB
             1         ,5
             1          5

    Yoann, in a kidding mood.
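    For what it's worth (and this is back-of-the-envelope arithmetic on my part, not something from the thread): out-of-line LOB data is allocated in CHUNK-sized pieces, and CHUNK defaults to the block size, so each ~10 KB XML document stored out of line in this 8 KB-block tablespace occupies two 8 KB chunks. Over a million rows that alone is roughly 15 GB before the space reserved for read consistency and extent overhead, which puts the observed 20 GB in a plausible range. A quick sanity check:
    # Back-of-the-envelope estimate of the expected LOBSEGMENT size.
    # Assumptions (mine): CHUNK = block size = 8 KB, LOBs stored out of line,
    # ~10 KB of XML per row, default PCTVERSION of 10%.
    import math

    rows = 1000000
    lob_bytes = 10 * 1024            # ~10 KB XML document per row
    chunk_bytes = 8192               # CHUNK defaults to the block size
    pctversion = 0.10                # space reserved for read consistency

    chunks_per_lob = math.ceil(lob_bytes / chunk_bytes)     # -> 2 chunks
    data_bytes = rows * chunks_per_lob * chunk_bytes
    total_bytes = data_bytes * (1 + pctversion)

    print("LOB data     : %.1f GB" % (data_bytes / 1024.0 ** 3))    # ~15.3 GB
    print("+ pctversion : %.1f GB" % (total_bytes / 1024.0 ** 3))   # ~16.8 GB
    So the chunk rounding (two 8 KB chunks per 10 KB document), rather than ASSM, is likely the dominant cost here.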

  • Manage max table storage space in case of excess data (size in GB)

    My scenario: I am using SQL Server 2008 R2. I have created a database named testDB. It has a lot of tables, including some log tables, and some of the log tables contain a very large number of records.
    My goal is to cap the size of those log tables and move their records to a table in another database at another location, so that my database has no problem.
    Please tell me, is there any way to do the above for my database?
    Is such functionality already built into SQL Server?
    Maybe this question has been asked before, but I still have no solution for my issue.
    Feel free to ask any questions.
    Thanks

    Well, there is no direct option to restrict a table's size. One way you can approximate it is to put that table on a separate filegroup and file and restrict the growth of the file. BUT this will not give an accurate limit on the number of rows, and it is not good practice; in fact you should never do it. And if you have more tables, each one would need its own filegroup/files, which is a bad idea.
    The more common solution is to archive the information into another table in a different database.
    A simple script such as this would work; it will archive all the log data older than 30 days to the archive database:
    use ArchiveDatabase
    GO
    insert into archivetable
    select * from testdb..Oldtablelog where logdate < dateadd(day, -30, getdate())
    Is there any particular reason you want to archive the data? If this is for database manageability (backups/maintenance), you can partition the table and mark the old filegroups as read-only and the new data as read-write.
    Hope it helps!!

  • Excessive thumbnail size

    Hi
    For an old-style photo album I needed a bunch of 100px thumbnails, so I asked LR3 to generate them for me. The results were files of around 15 to 20 KB, and not very nice looking either. So I thought I'd try the feature to limit file size on export and started with 10 KB. The interesting response I received was that a good deal of them (more than half the pictures) could not be generated.
    So I pulled out good old FrontPage (or SharePoint Designer) and gave it the original pictures. The resulting thumbnails were not only notably smaller (3 to 5 KB) but also looked nicer, not to mention the border effect that comes out nicely there.
    I wonder what the trick is to get smaller thumbnails.
    BTW: pulling JPG quality down to 60 makes them really poor quality, so I'd rule that option out.
    Thx & Best Regards
    Hubert

    johnrellis wrote:
    At quality 90, they ranged in size from 6 to 9 KB, and at quality 75, from 4 to 6 KB.  I couldn't casually tell the difference between the two qualities.
    Try this link: http://regex.info/blog/lightroom-goodies/jpeg-quality
    The author of this is also the author of the plugin mentioned elsewhere in the thread.
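    If you end up generating the thumbnails outside Lightroom anyway, a few lines of scripting will do it. The sketch below is only an illustration using the Pillow imaging library (my choice, nothing to do with LR3 or FrontPage): resize to fit 100 px and save at JPEG quality 75, which for images this small typically lands in the few-KB range mentioned above. The folder names are hypothetical.
    # Sketch: generate ~100 px JPEG thumbnails with Pillow (not Lightroom).
    from pathlib import Path
    from PIL import Image

    SRC = Path("originals")   # hypothetical folder of full-size JPEGs
    DST = Path("thumbs")
    DST.mkdir(exist_ok=True)

    for photo in SRC.glob("*.jpg"):
        with Image.open(photo) as im:
            im.thumbnail((100, 100))      # fit within 100x100, keeps aspect ratio
            im = im.convert("RGB")        # make sure the mode is JPEG-compatible
            im.save(DST / photo.name, "JPEG",
                    quality=75,           # the quality the reply above found acceptable
                    optimize=True)        # slightly smaller files at no visual cost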

  • Excessive backup size

    Can anyone explain why my hourly backups are so large? I do not have any "large file apps" I can think of (e.g. Entourage), yet my system seems to want to back up 650-odd MB every hour:
    Aug 22 13:10:15 akmac com.apple.backupd[6423]: Backing up to: /Volumes/Time Machine Backups/Backups.backupdb
    Aug 22 13:10:31 akmac com.apple.backupd[6423]: 662.9 MB required (including padding), 842.77 GB available
    Aug 22 13:10:43 akmac com.apple.backupd[6423]: Copied 2045 files (1.2 MB) from volume Macintosh HD.
    Aug 22 13:10:43 akmac com.apple.backupd[6423]: 661.5 MB required (including padding), 842.77 GB available
    Aug 22 13:10:46 akmac com.apple.backupd[6423]: Copied 116 files (93 bytes) from volume Macintosh HD.
    Aug 22 13:10:48 akmac com.apple.backupd[6423]: Starting post-backup thinning
    Any suggestions appreciated. Thanks

    Allan
    Thanks for the reply.
    I do not have an Entourage database or other large file, and that is why I am puzzled.
    Looking at the log again I wonder if I am misreading it.
    Is the size of the backup only 1.2 MB plus 93 bytes? See below:
    Aug 22 13:10:43 akmac com.apple.backupd[6423]: Copied 2045 files (1.2 MB) from volume Macintosh HD.
    Aug 22 13:10:46 akmac com.apple.backupd[6423]: Copied 116 files (93 bytes) from volume Macintosh HD.
    I wonder if the 662.9 MB required statement is referring to something else?

  • Having a problem with Excessive "modified" memory usage in Win7 x64, upwards of 3.6GB, any suggestions?

    I have 6 GB of RAM and a fresh install of Windows 7 x64, and the screenshot shows what happens after leaving my PC on for a couple of days (3782+ MB being used by modified memory at the moment).
    http://wow.deconstruct.me/images/ExcessiveMemory.jpg
    Any ideas on this?
    Edit:
    Added this after first round of suggestions
    http://wow.deconstruct.me/images/NotSoExcessiveMemory.jpg
    This is uptime of around 2 hours.
    The first image is of uptime of around 3-5 days.

    Matthew,
    The only reason why these pages are kept on the modified list indefinitely is because the system doesn't have any available pagefile space left. If you increase the size of the pagefile the system will write most of these pages to disk and then move them from the modified list to the standby list. Standby pages are considered part of "available memory", because they can be reused for some other purpose if necessary.
    Whether this would "fix" the problem or not depends on what the actual problem is. If it's an unbound memory leak then increasing the size of the pagefile will simply allow the system to run longer before it eventually hits the maximum pagefile size limit, or runs out of disk space. On the other hand, if it's a case of some application allocating a lot of memory and not using it for a long time, then increasing the pagefile might be a perfectly valid solution.
    Allowing the system to manage the size of the pagefile actually works well in most cases. Pagefile fragmentation (at the filesystem level) can only occur when the initially chosen size is not large enough and the system has to extend it at run time. For win7 we have telemetry data that shows that even for systems with 1 GB of RAM, less than 0.1% of all boot sessions end up having to extend the pagefile, and this number is even lower for larger amounts of RAM. If you think you are in that 0.1% and your pagefile might be getting fragmented, you can manually increase its minimum size such that the total system commit charge stays below 80% even if you run all your apps at once (80% is the threshold at which the pagefile is automatically extended). This will make sure the pagefile is created once and then stays at the same size forever, so it can't fragment. The maximum size can either be set to the same value as the minimum, or you can make it larger so that the system is more resilient to memory leaks or unexpectedly high loads.
    By the way, Windows doesn't use pagefiles as "extra memory", it uses them as a backing store for private pages, just like regular files are used as a backing store for EXEs/DLLs and memory mapped files. So if the system really has more than enough RAM (like in your second screenshot, where you have 3.6 GB of free pages) you shouldn't see any reads from the pagefile. You can verify this by going to the Disk tab in the resource monitor and looking for any disk IO from pagefile.sys. On smaller systems that don't have an excess of free pages you may see periodic reads from the pagefile, and this is expected because the total amount of data referenced by the OS/drivers/processes is larger than the total RAM. Forcefully keeping all pagefile-backed pages in memory (which is what disabling the pagefile does) would simply mean some other pages (memory mapped files, DLL code or data etc) would have to be paged out.
    Regarding further troubleshooting steps: If the system runs fine with a larger pagefile (commit charge stabilizes well below 80%, and you no longer see gigabytes of modified pages accumulating in memory) then you don't really need to do anything. If the problem persists, you can check for any processes with an abnormally high commit charge, and also check kernel memory usage in task manager. If it's a kernel leak you can usually narrow it down to a particular driver using poolmon.exe or kernel debugger.
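    If you want to keep an eye on the commit charge relative to that 80% auto-extension threshold without watching Task Manager, a small script can report it. The sketch below is my own (not from this reply); it calls the Win32 GlobalMemoryStatusEx API through Python's ctypes, where ullTotalPageFile is the current commit limit and ullAvailPageFile is the unused portion of it.
    # Sketch: print the system commit charge as a percentage of the commit limit.
    import ctypes
    from ctypes import wintypes

    class MEMORYSTATUSEX(ctypes.Structure):
        _fields_ = [
            ("dwLength", wintypes.DWORD),
            ("dwMemoryLoad", wintypes.DWORD),
            ("ullTotalPhys", ctypes.c_uint64),
            ("ullAvailPhys", ctypes.c_uint64),
            ("ullTotalPageFile", ctypes.c_uint64),   # commit limit (RAM + pagefile)
            ("ullAvailPageFile", ctypes.c_uint64),   # commit limit still unused
            ("ullTotalVirtual", ctypes.c_uint64),
            ("ullAvailVirtual", ctypes.c_uint64),
            ("ullAvailExtendedVirtual", ctypes.c_uint64),
        ]

    stat = MEMORYSTATUSEX()
    stat.dwLength = ctypes.sizeof(MEMORYSTATUSEX)
    ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(stat))

    commit_used = stat.ullTotalPageFile - stat.ullAvailPageFile
    commit_pct = 100.0 * commit_used / stat.ullTotalPageFile

    print("Commit charge: %.1f GB of %.1f GB limit (%.0f%%)"
          % (commit_used / 2.0 ** 30, stat.ullTotalPageFile / 2.0 ** 30, commit_pct))
    if commit_pct > 80:
        print("Above the 80% threshold at which Windows extends the pagefile.")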

  • Pagefile for SAP Erp2005 and Solution Manager 4.0

    Hi all,
    I've installed SAP ERP2005 and Solution Manager 4.0 on the same server with Windows 2003 Standard 32bit.
    The RAM on this server is 3325MB.
    How large a paging file must I configure?
    At the moment I have configured the pagefile size as defined in the installation guide:
    1x RAM + 8 GB, i.e. 12285 MB.
    Is that correct?
    For both instances I set the PHYS_MEM parameter to 2328 (70% of the RAM). Is that correct?
    P.S. How do I calculate the [BE] parameter?
    [BE] = the maximum possible number of users, calculated from the size of physical main memory
    Thanks in advance
    Moreno

    I agree with Yaroslav,
    you should NOT set those values:
    em/initial_size_MB     |         [PM] |         [PM] | Mbyte
    em/max_size_MB         |        20000 |       100000 | Mbyte
    em/address_space_MB    |          512 |         4096 | Mbyte
    ztta/roll_first        |            1 |            1 | Byte
    ztta/roll_area         |      2000000 |      3000000 | Byte
    ztta/roll_extension    |   2000000000 |   2000000000 | Byte
    abap/heap_area_dia     |   2000000000 |   2000000000 | Byte
    abap/heap_area_nondia  |   2000000000 |            0 | Byte
    abap/heap_area_total   |   2000000000 | [PM]*1048576 | Byte
    rdisp/ROLL_MAXFS       |   [BE] * 100 |   [BE] * 100 | 8KB Block
    rdisp/ROLL_SHM         |   [BE] * 100 |   [BE] * 100 | 8KB Block
    rdisp/PG_MAXFS         |        32768 |        32768 | 8KB Block
    rdisp/PG_SHM           |
    so... that [BE] value is not to be used.

  • Pagefile gets deleted after reboot

    I have moved the pagefile to another partition and left a small piece on the system drive:
    C: (Windows) 200-500 MB
    B: (Temp) 4000-16000 MB
    When I set the pagefile size, the pagefile gets created immediately, but after reboot it gets deleted again and remains only on the system drive. So after reboot, there is only a 200 MB pagefile, not 4200 MB. And the total available memory is only a little bigger than the amount of RAM, not 16 GB as it should be.
    What is wrong? Am I missing something?
    I have Win7 Pro SP1 x64, and a lot of free space on B:. Both partitions are on the same SSD disk.

    Hi Xars,
    Thanks for your digging. We found cases similar to yours, so I tested on my own machine, and it turns out it really is related to the drive letter. The letters A and B are traditionally reserved for floppy disk drives, so it is not recommended to assign A or B to anything other than a floppy (or a floppy substitute), even if you do not have a floppy disk drive installed. I suppose Windows might refuse on the assumption that you are putting a pagefile on a floppy disk.
    We found a sort of workaround for you, but it takes a lot of work and may even require reformatting your hard disk.
    Therefore, we recommend changing the drive letter; we think that is the best way to avoid this kind of issue.
    About changing, adding, or removing a drive letter (Windows Vista and 7):
    http://windows.microsoft.com/en-us/windows/change-add-remove-drive-letter#1TC=windows-vista
    You could also refer to the similar case "Pagefile on a volume that uses "A" or "B" as the drive letter":
    http://answers.microsoft.com/en-us/windows/forum/windows_7-performance/pagefile-on-a-volume-that-uses-a-or-b-as-the-drive/fdac6f23-a34d-e011-8dfc-68b599b31bf5
    Regards
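    A quick way to see this symptom from the command line (my own sketch, not part of the reply above) is to compare what is configured against what Windows actually created after the reboot; wmic exposes the former through the pagefileset alias (Win32_PageFileSetting) and the latter through the pagefile alias (Win32_PageFileUsage).
    # Sketch: configured pagefiles vs. pagefiles actually in use after boot.
    import subprocess

    def wmic_names(alias):
        out = subprocess.check_output(["wmic", alias, "get", "Name", "/value"],
                                      universal_newlines=True)
        return sorted(line.split("=", 1)[1].strip().lower()
                      for line in out.splitlines() if line.startswith("Name="))

    configured = wmic_names("pagefileset")   # Win32_PageFileSetting
    in_use = wmic_names("pagefile")          # Win32_PageFileUsage

    print("Configured:", configured)
    print("In use    :", in_use)
    for missing in set(configured) - set(in_use):
        print("Configured but missing after reboot:", missing)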

  • Computer slows or freezes and displays high memory usage message firefox + thunderbird

    When surfing between websites, the computer screen slows or freezes. I hit the return key and the new page can take 2-3 minutes to come up. The screen shows a message that there is high memory use on firefox, and also thunderbird when loading new incoming emails. The internet files are on C so here is a defrag report from 6.7.2011. I hope this helps your diagnosis.
    Volume (C:)
    Volume size = 29.29 GB
    Cluster size = 4 KB
    Used space = 7.73 GB
    Free space = 21.57 GB
    Percent free space = 73 %
    Volume fragmentation
    Total fragmentation = 13 %
    File fragmentation = 27 %
    Free space fragmentation = 0 %
    File fragmentation
    Total files = 31,551
    Average file size = 364 KB
    Total fragmented files = 4
    Total excess fragments = 21,222
    Average fragments per file = 1.67
    Pagefile fragmentation
    Pagefile size = 395 MB
    Total fragments = 2
    Folder fragmentation
    Total folders = 3,093
    Fragmented folders = 1
    Excess folder fragments = 0
    Master File Table (MFT) fragmentation
    Total MFT size = 48 MB
    MFT record count = 35,026
    Percent MFT in use = 71 %
    Total MFT fragments = 3
    Files that cannot be defragmented
    Fragments   File Size   File
    7,684       804 MB      \Documents and Settings\cameron\Application Data\Thunderbird\Profiles\duo7ne8y.default\Mail\mailbox.aon.at\Inbox
    13,535      958 MB      \Documents and Settings\cameron\Application Data\Thunderbird\Profiles\duo7ne8y.default\Mail\mailbox.aon.at\Trash

    @ kmrupam: 3.40Ghz
    @ cor-el disabling hardware acceleration didn't help unfortunately.
    Thanks
    Simon

  • Adobe Photoshop CS2 has encountered a problem and needs to close.

    Just starting to get this error on open in CS2 after a few months of smooth sailing on a new system (below). CS2 now cannot open. It started occasionally and increased. Tried reinstalling. Tried deleting the PS config files and same error. Any suggestions?
    Here's a sysinfo paste of my system.
    Nick
    XP pro SP2
    Small Bus. Server network node
    Profile- Username.DOMAIN
    PHOTOSHOP
    Photoshop CS2 (upgrade from ver. 6)
    C:\Program Files\Adobe\Adobe Photoshop CS2\9.0
    MB - ASUS P5K-E
    Disk C: 819 GB Available, 931 GB Total, 819 GB Free
    Disk D: 96 GB Available, 232 GB Total, 96 GB Free
    Intel ICH8R/ICH9R SATA RAID Controller
    Physical Memory 2048 MB Total, 1149 MB Free
    Virtual Memory 4740 MB Total, 3900 MB Free
    PageFile Name \??\C:\pagefile.sys
    PageFile Size 800 MB
    PageFile Name \??\D:\pagefile.sys
    PageFile Size 2047 MB
    One Physical Processor / 2 Cores / 2 Logical Processors / 64 bits
    Intel Core2 Duo CPU E8400 @ 3.00GHz
    Video1: ATI Radeon HD 3800 Series
    Video2: ATI Radeon HD 2400 PRO
    Memory Capacity 2048 MBytes
    Slot 1
    Corsair
    CM2X1024-8500C5D
    1024 MBytes
    DDR2-800 (400 MHz) running 1066
    64 bits
    Voltage SSTL 1.8V
    Slot 3
    Corsair
    CM2X1024-8500C5D
    1024 MBytes
    DDR2-800 (400 MHz) running 1066
    64 bits
    Voltage SSTL 1.8V

    Just ran memtest and received no errors, which more or less clears the hardware. I have no idea how to proceed. Does Adobe offer any direct support for this sort of problem?

  • Windows 7 install and/or environment freezing on new 13" MacBook Pro

    Hello! Long post, trying to give as much detail as I can... thank you in advance for your help on this!!
    I think I posted in the incorrect forum earlier -- reposting here where it's more relevant.
    I have a frustrating issue I've been battling all weekend. I can't seem to figure it out.
    I'm trying to set up dual boot (Boot Camp) on my new 13" MacBook Pro, installing Windows 7. I've tried x64 (preferable) and x86. This is the mid-April 2010 13" MacBook Pro. C2D 2.4GHz, 4GB RAM, 250GB HD, nVidia 320M.
    The problem I'm having is that the machine is hard-locking. Sometimes it locks up during install, in the 2nd phase after it restarts and is "Completing Installation" (this step: http://spsexton.files.wordpress.com/2008/11/install-13.jpg). This makes me have to restart the installation completely, which is frustrating. Sometimes it locks up at the login screen. Sometimes it locks up once I'm logged in to Windows.
    I have noticed that if the installation completes and I can get to the desktop, I can generally use it OK until I install the Boot Camp Drivers (3.1, from the 10.6.3 install DVD). I had a hunch and had also read that it may have to do with the nVidia drivers it installs, so I have tried uninstalling those prior to restart. It locks up anyway at restart (during login). Note that this generally is the case -- sometimes it locks up before I even get to the Boot Camp Drivers step. I sometimes can then get in to Safe Mode without it locking (sometimes it does, sometimes it doesn't), and I think I had some repeated errors in the Event Viewer that appeared erroneous (certainly not errors that should cause a hard lock) -- I can post these if needed when I get home from work this evening.
    I was having consistent lockups during installation (at the "Completing Installation" step I mentioned above) when I'd let the Boot Camp Assistant set up the partition for me. I thought BCA might be doing something funny, so I set up the partition myself using Disk Utility then installed, and the install went in OK, however I still had it lock up after installing the Boot Camp Drivers.
    Here's the steps I've been following:
    1. Partition drive, either using Boot Camp Assistant or Disk Utility. Format to FAT if using Disk Utility.
    2. Boot into Windows installer
    3. Format drive to NTFS in installer
    4. Install
    5. Reboot
    6. Complete install (often locks here as mentioned above)
    7. If install completes, boot in to Windows
    8. Change some system settings -- temp folders, workgroup name, pagefile size, turn off UAC
    9. Reboot
    10. Install Boot Camp Drivers from DVD (sometimes locks before I get here)
    11. (have tried sometimes) Uninstall nVidia drivers installed by Boot Camp
    12. Reboot
    13. Consistent locks
    14. Start over, both trying a new Partition creation (delete old, expand Mac partition, recreate new partition via BCA or DU), or using existing partition
    Some other details:
    1. This environment seems to work fine via Parallels. I only appear to have locking issues if I try to boot natively.
    2. The Boot Camp Drivers installs a bunch of driver crud for devices the machine does not have. Could this be causing an issue? Drivers that Windows is attempting to load and failing, thus causing issues?
    3. I'm mainly trying to install x64 due to the 4GB of RAM -- could this be an issue with the Boot Camp Drivers and Windows x64?
    4. I don't think this is a motherboard issue as the machine is completely stable otherwise -- I'd think that if it was a hardware failure, especially with the motherboard, I'd see other issues elsewhere..
    Any ideas??? I'm pulling my hair out on this one. The machine is brand new (just got it Tuesday last week, just started using it this last weekend). My last MacBook Pro (previous model 13" MBP) took Windows 7 without a single hitch and ran smooth as butter.
    Help?

    I just found something of a solution for this... I don't know whether it will work for all of you, but if you have the same problem, try pressing the power button right at the welcome screen after you type your username and password (or, if you didn't set a password, just press the power button right at the welcome screen). It will automatically go to the lock state; log in again and the Windows desktop will appear. This worked for me on Windows 7 64-bit. If you can log in that first time, you don't need to press the power button again on later logins. XD
    Cheers...
    Good luck to you all. =)
    Message was edited by: conggg

  • Best way to organize photos and home video

    Looking for the optimum way to organize photos and home videos. Goal is to have all devices have equal access to all the media. Just as important is for this to be as easy as possible to maintain and use.
    Have an iMac with iTunes and iPhoto
    Have multiple iOS devices with different family members each of whom take photos and videos with their devices
    Have multiple Apple TVs
    iMac will be the hub with wifi synching
    Right now about 30,000 files probably 1000 are videos (and growing)
    I don't do any editing of photos or videos
    Current setup is through iPhoto. I am importing all images, not using the reference option. This worked fine until we started taking more videos, which made for a large 200 GB iPhoto library that barely loads even with 16 GB of RAM.
    I want to use iPhoto, not iTunes, because I keep everything organized by event, and that includes photos and videos. And I don't want two separate organizing systems to keep track of.
    Most of my library is also available outside of iPhoto; I have a backup, so I could use the referenced feature of iPhoto, which would mean a smaller library. However, we use Photo Stream on all iOS devices, which automatically imports into iPhoto, so I don't have to worry about those photos - it's easy. But then I would have some media copied into the library and some referenced. This doesn't work for videos, so I will need to copy them.
    I have a copy of the iPhoto Library Manager software, which could solve the excessive library size. That's cumbersome because I have to sync the correct library with the iOS devices and also make sure Photo Stream is in the right library. I have only used this software a little, so I could be wrong about this solution.
    I appreciate any ideas that might work. Of course let me know if I am missing something, I am sure I am.
    Would Aperture be a better solution? Some other software?

    1 - do not change to a referenced library with iPhoto - it has many problems and should not be done - if you switch to Aperture it is OK - Aperture handles referenced libraries fine
    2 - iPhoto does not load your entire library, and while 200 GB is large it is not giant and is not what is causing your problems - if you have problems please describe them and I will try to help with them - it is not library size - my library is 263 GB and I have 8 GB of RAM and there are no issues
    3 - not sure what your constraints are, but for me having all iOS devices logged into iCloud using the same Apple ID works best - the downside is that we all see the same things - which is our desire and I think yours too
    LN

  • Is it possible to export, and include AE compositions used within a Premiere Pro project, via project manager?

    I've several Premiere Pro CC 2014 projects I want to relocate via the Project Manager (one at a time). They're excessive in size (many long and unused footage files), so "Create New Trimmed Project" is the most attractive option within the Project Manager. BTW I love how the Project Manager does this... However, when I check the new location, I don't see any linked After Effects files. Fine, I try the other setting, "Collect Files and Copy to New Location". This option lets me copy everything, including rendered scenes featuring my AE compositions (and a lot of excess, unused files)... but still no AE files!
    Is it possible to export a Premiere Pro project, excluding unused files, yet include all used files... especially linked files such as AE compositions?

    Though not exactly the solution I seek, I've tried an alternative approach... it works (if anyone's interested)... but my original question still stands.
    Alternative approach (and probably the only option now):
    When I copy & trim my Premiere Pro project via the Project Manager to its new location, I can open the new file and see that the track editor now includes broken links to After Effects compositions. Since I don't wish to relink these to the current .AEP file, I can package** the original AE project to a location near my newly-located Premiere Pro project folder. To do this, I open the original .AEP file containing any compositions linked to my original Premiere Pro project and use File > Dependencies > Collect Files. This feature copies all files (and a new .AEP copy) from this AE project to my newly selected destination (as long as no files are missing).
    So at this point, when I open my newly-located PP project, I can relink the broken AfterEffects composition links in PP to the new .AEP file. Yay!
    **package (for those unaware) is the term used within Adobe InDesign for the feature that gathers all linked files that make up the overall design and layout and copies them into a new folder.
