PLL Library Size - Fewer Larger Libraries or More Smaller Libraries

Hi
I am interested in people's thoughts on whether it is better to have fewer larger libraries or more smaller libraries. If you do break into smaller PLL libraries, should the grouping be such that some forms will not use the modules in a given library, i.e. the library does not have to be attached to every form? For common modules that all forms require access to, do you achieve anything by having many little libraries rather than one larger library?
Is it correct that PLL libraries are loaded into memory at run time, when the form is loaded and the library is attached?
What are the issues to consider here?
Thanks

Hi Till,
My honest opinion...do not merge the libraries. Switch between them using iPhoto Library Manager and leave it at that.
A 22 gig library is way too big to run efficiently.
Lori

Similar Messages

  • Few large nodes or many small nodes

    Hi guys,
    In general, which option is better for implementing a RAC system: a few large nodes or many small nodes?
    Say we have a system with 4 nodes of 4 CPU and a system with 8 nodes of 2 CPU. Will there be a performance difference?
    I understand there won't be a clear cut answer for this. But I'd like to learn from your experiences.
    Regards,

    Hi,
    The worst case in terms of block transfer is 3-way: no matter whether you have 100 nodes, a single block will be accessed in at most 3 hops. But there are other factors to consider;
    for example, if you're using FC for SAN connectivity, I'd assume connecting 4 servers could cost more than connecting 2.
    On load: say your load is 80 (whatever units) and equally distributed among 4 servers, so each server carries 20 units. If one goes down, or is shut down for a rolling patch, its load is distributed among the other 3, so each will carry 20 + 20/3 ≈ 26.7 units. In the same scenario with only two servers, each normally carries 40, and if one goes down the remaining server has to carry the entire load. So you have to do some capacity planning in terms of CPU to decide whether 4 nodes or 2 nodes are better, as the sketch below illustrates.
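    To make that failover arithmetic concrete, here is a minimal JavaScript sketch of the same calculation (the 80-unit load is the example figure from the post; the rest is illustrative):

    // Per-node load before and after one node fails (units are arbitrary).
    function perNodeLoad(totalLoad, nodeCount) {
        return totalLoad / nodeCount;
    }

    var totalLoad = 80;
    [4, 2].forEach(function (nodes) {
        var normal = perNodeLoad(totalLoad, nodes);
        var afterFailure = perNodeLoad(totalLoad, nodes - 1);
        console.log(nodes + ' nodes: ' + normal.toFixed(1) + ' per node, ' +
            afterFailure.toFixed(1) + ' per node after one failure');
    });
    // 4 nodes: 20.0 per node, 26.7 per node after one failure
    // 2 nodes: 40.0 per node, 80.0 per node after one failure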

  • Dual monitors: window size too large when opening in the smaller monitor.

    I use Windows 7 with two monitors. When I open Firefox directly from my desktop, it opens on the larger monitor, and the window size is just fine. But if I click a link from an email, Firefox opens in the second, smaller monitor, and the window is too large for that screen.
    Please help me figure out how to either (a) make Firefox open with a smaller window; or (b) make Firefox open on the primary, larger monitor.

    Oh, forgot to say: Mavericks 10.9.5, running on the Mac Pro.

  • More smaller disks or fewer larger ones?

    Hi:
    We're in the middle of procuring disks for a new Oracle server. We plan to stripe disks to randomly distribute I/O load across disk heads. There is some debate on whether we should use fewer large disks or more smaller ones. For example, the datafiles for one schema will consume 432G. In terms of performance, would it be better to stripe 6 x 72G disks or 12 x 36G disks? Are there any other pros/cons for one scenario vs the other?
    Thanks in Advance !

    Generally, the more heads you have working, the better, so more smaller disks would be the way to go. Additionally, with smaller drives, recovery from a drive failure would be faster. Of course, the major drawback to this approach is that you will need bays for all the drives, and if you are looking at server-attached storage, you may not have enough drive bays available.
    Another consideration is the storage technology you are looking at; for example, HP EVA SANs require a minimum of 8 drives per drive set, so in your example you would need a minimum of 8 x 72G disks whether you needed the capacity or not. A rough sketch of the spindle-count arithmetic follows.
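    As an illustration of the "more heads" argument, here is a small JavaScript sketch; the 120 IOPS per disk is an assumed round figure for illustration, not a vendor specification:

    // Aggregate random I/O scales with spindle count at equal total capacity.
    function stripeEstimate(diskCount, diskSizeGB, iopsPerDisk) {
        return {
            capacityGB: diskCount * diskSizeGB,
            aggregateIOPS: diskCount * iopsPerDisk
        };
    }

    console.log(stripeEstimate(6, 72, 120));   // { capacityGB: 432, aggregateIOPS: 720 }
    console.log(stripeEstimate(12, 36, 120));  // { capacityGB: 432, aggregateIOPS: 1440 }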

  • Hi all! What is the best way to create the correct space for baseball jersey names and numbers, and to make sure they are the right size for large-format printing?

    What is the best way to create the correct space for baseball jersey names and numbers, and to make sure they are the right size for large-format printing?

    Buying more hard drive space is a very valid option, here.  Editing takes up lots of room, you should never discount the idea of adding more when you need it.
    Another possibility is exporting to MXF OP1a using the AVC-I codec.  It's not lossless, but it is Master quality.  Plus the file size is a LOT smaller, so it may suit your needs.

  • Repair iTunes library (damaged) file. Multiple iTunes libraries on an external hard drive. All libraries have been working fine. I did some "housecleaning" on one of the large libraries (296 GB). iTunes quit and now reports a damaged file. The 296 GB is still on the HD.

    How do I repair a damaged iTunes library file? I have multiple iTunes libraries on an external hard drive, and all have been working fine. I did some "housecleaning" on one of the large libraries (296 GB). iTunes quit and now reports a damaged file. The 296 GB is still on the HD.

    The duplication is unnecessary. Exporting creates a duplicate of the file, so now you'll have a duplicate of a duplicate. Exporting is not "working on" a file.
    No, it's not merging; it's exporting from one library to the Finder and into another. No matter what, you lose something. If you export the Original, you leave behind all the work you've done in iPhoto. If you export anything else, you lose the non-destructive editing and the ability to revert to the original. With merging you preserve that work. Yes, you can trash the old Library when you have completed the manoeuvre, but no, it's not the same thing as merging.
    This User Tip
    https://discussions.apple.com/docs/DOC-4921
    has details of the options in the Export dialogue. But in brief:
    Current gets you the iPhoto Preview, used for sharing via media browsers. It's a medium-quality JPEG, missing metadata. Original gets the file you imported, unchanged, and then you can export different versions of the current version at different qualities. If you choose to export anything except the Original, you do not get a Raw. There's no such thing as an "edited Raw", and you lose the connection between the original and the exported version. That means you've taken a non-destructive workflow and turned it into a destructive one.
    The Tiff will certainly be higher quality and less likely to suffer generational loss in future editing, but the file sizes are vast, often more than 10 times the size of the JPEG (see the sketch after this reply).
    Put it this way: it would be cheaper to buy Library Manager than the disk you'll need to contain all the Tiffs. Unless you plan on a lot of editing, I'd go for a high-quality JPEG as a reasonable compromise.
    To be clear:
    So, I guess my new question is, how can I edit the Raw image and keep it in a RAW format that IPhoto recognizes so that I can reprocess without any loss at a future time, or is it that once you edit a RAW image, then it is no longer a RAW image?
    Once you edit a Raw it can no longer be a Raw. End of.
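    As a back-of-envelope check on the claim that Tiffs often run more than 10 times the size of the JPEG, here is a small JavaScript sketch; the 16-megapixel resolution and the 4 MB JPEG size are illustrative assumptions, and the Tiff estimate assumes an uncompressed 8-bit RGB file:

    // Uncompressed 8-bit RGB: 3 bytes per pixel.
    function tiffSizeMB(megapixels) {
        return megapixels * 1e6 * 3 / (1024 * 1024);
    }

    var mp = 16;                 // a typical camera resolution (assumed)
    var tiff = tiffSizeMB(mp);   // ~45.8 MB
    var jpeg = 4;                // assumed high-quality JPEG size in MB
    console.log('Tiff ~' + tiff.toFixed(1) + ' MB, roughly ' +
        (tiff / jpeg).toFixed(0) + 'x the JPEG');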

  • Is Aperture more stable w/ larger libraries (esp. Faces)?

    I have a large library of nearly 30,000 photos that has been with me essentially since 2005.  I've made several attempts at going through and tagging faces, but continue to find iPhoto slow and that it occasionally forgets faces that I've tagged.  I've gone through events as many as three times only to come back in a few months to find that there are faces that I *know* I tagged showing up as untagged.
    I'm wondering whether Aperture's more professional nature, and the fact that it's developed on a more modern 64-bit codebase, would mean that it is more stable with Faces in larger libraries.  This is my personal photo library, but I am quite familiar with Apple's more professional offerings in Final Cut Pro X, which I use nearly daily.

    +1 to William's post.  I kept Faces off until 3.1.3 (iirc).  It's been stable since then.  Afaict, if you don't use Faces even though it is enabled, it doesn't slow you down much (and then, only at import).  On large Libraries (more than 100,000 Images) Faces runs a little bit slow.  When using Faces, it is best to use it at the Project(s) level rather than the Library level.  (Léonie has posted some helpful hints on the forum -- try a search.)  And of course many problems and confusions can be avoided by reading the User Manual.  Here is the section on Faces.
    All-in-all, I find it only moderately more useful than simply keywording Images with the names of people in them.
    IME, it has been completely reliable since I turned it on (no lost or misplaced data).  I suggest running "Repair" and perhaps even "Rebuild" on your Library before turning Faces on.  Back up first (always; this is just a precaution, as I have yet to hear of a single case of Libraries being vitiated by either action).

  • I used Scripting Bridge to add a movie bigger than 5 GB; exactly two minutes later iTunes returned a failure, but the file was actually added to the iTunes Library successfully. The copying takes more than 5 minutes to complete. Why?

    I used Scripting Bridge to add a movie bigger than 5 GB. Exactly two minutes after the request, iTunes returned a failure, but the file was actually added to the iTunes Library successfully. The copying takes more than 5 minutes to complete. Why did the iTunes Scripting Bridge return a failure when the operation actually succeeded? It occurred exactly 2 minutes after I submitted the request to Scripting Bridge. Is this 2 minutes related to the Apple Event timeout? If it is, how do I get around this problem? Thx

    I can tell you that this is some of the absolutely worst customer service I have ever dealt with. I found out from a store employee that when they are really busy with calls, they have third party companies taking overflow calls. One of those companies is Xerox. What can a Xerox call center rep possibly be able to authorize on a Verizon account?  I'm Sure there is a ton of misinformation out there due to this. They don't note the accounts properly or so everyone can see them. I have been transferred before and have asked if they work for Verizon or a third party also and was refused an answer so, apparently they aren't required to disclose that information. I spent a long time in the store on my last visit and it's not just customers that get the runaround. It happens to the store employees as well and it's beyond frustrating.

  • My iTunes library is too large for my iPod classic--can I create a separate library so that I can split the current library and sync the distinct libraries onto separate iPods?

    My iTunes library is too large for my iPod classic--can I create a separate library so that I can split the current library and sync the distinct libraries onto separate iPods?

    - If the Gmail accounts are IMAP ones, then if the library is erased and you then use the account, old email will be re-downloaded.
    - Erasing the library on the MacBook does not affect what is on the iMac. On the other hand, if you delete the messages in the Mail app on the MacBook, they will be deleted from all devices.
    - See the middle of the following; it would seem to help in your situation:
    http://www.howtogeek.com/209517/how-to-stop-your-macs-mail-app-from-wasting-gigabytes-of-space/

  • Using the SharePoint Client Object Model with large libraries (more than 5000 elements)

    Hi,
    I have an issue with accessing a large list (more than 5000 elements) using the SharePoint Client Object Model.
    It worked until yesterday, when the list had fewer than 5000 elements; now the library holds more than 5000. I changed the list limit in Central Administration from 5000 to 10000, but that helped only for list views. The script now returns 0 records.
    var clientContext = SP.ClientContext.get_current();
    var currentListId = _spPageContextInfo.pageListId;
    oList = clientContext.get_web().get_lists().getById(currentListId);
    var camlQuery = new SP.CamlQuery();
    // RowLimit must sit inside the <View> element; it also has to stay
    // below the list view threshold or the query is rejected.
    camlQuery.set_viewXml('<View Scope="RecursiveAll">' +
        '<Query>' +
        '<Where>' +
        '<And>' +
        '<Eq><FieldRef Name="ContentTypeId"/><Value Type="Text">' + contentTypeId + '</Value></Eq>' +
        '<And>' +
        '<Eq><FieldRef Name="_x0040_Client" LookupId="TRUE" /><Value Type="Lookup">' + client + '</Value></Eq>' +
        '<Eq><FieldRef Name="_x0040_Company" LookupId="TRUE" /><Value Type="Lookup">' + organisation + '</Value></Eq>' +
        '</And>' +
        '</And>' +
        '</Where>' +
        '</Query>' +
        '<ViewFields>' +
        '<FieldRef Name="ID" />' +
        '</ViewFields>' +
        '<RowLimit>100</RowLimit>' +
        '</View>');
    // Scope is already set on the <View> element above; SP.CamlQuery has
    // no ViewAttributes property in the JavaScript object model.
    listItems = oList.getItems(camlQuery);
    clientContext.load(listItems);
    clientContext.executeQueryAsync(Function.createDelegate(this, onQuerySucceeded), Function.createDelegate(this, onQueryFailed));
    Please advise how I can resolve this issue. Thanks a lot!!
    from MSDN forum...

    Thank you, friends!
    To resolve my issue, I did the following:
    1) turned off the item limit for my list,
    2) used the SPQuery.ListItemCollectionPosition property,
    3) and corrected a small mistake in my CAML Query (the RowLimit belongs inside the View element).
    Maybe point #2 alone would have been enough, don't you think? :)
    A paging sketch using the JavaScript object model follows below.
    from MSDN forum...
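    For reference, the JavaScript client object model analogue of the server-side SPQuery.ListItemCollectionPosition is SP.CamlQuery's set_listItemCollectionPosition. Here is a minimal paging sketch, assuming a viewXml string that already contains a <RowLimit>; the fetchAllItems and fetchPage names are illustrative:

    // Page through a large list so each query stays under the view threshold.
    function fetchAllItems(clientContext, list, viewXml, onDone) {
        var allItems = [];
        function fetchPage(position) {
            var camlQuery = new SP.CamlQuery();
            camlQuery.set_viewXml(viewXml);                 // must include <RowLimit>
            camlQuery.set_listItemCollectionPosition(position);
            var items = list.getItems(camlQuery);
            clientContext.load(items);
            clientContext.executeQueryAsync(function () {
                var enumerator = items.getEnumerator();
                while (enumerator.moveNext()) {
                    allItems.push(enumerator.get_current());
                }
                // The position is null once the last page has been read.
                var next = items.get_listItemCollectionPosition();
                if (next !== null) {
                    fetchPage(next);
                } else {
                    onDone(allItems);
                }
            }, function (sender, args) {
                alert('Request failed: ' + args.get_message());
            });
        }
        fetchPage(null);
    }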

  • Dual library question (iTunes gets the library size and bitrate wrong)

    I have two libraries, one ripped in Lossless, and the other in AAC at 256 kbps. I use one for streaming, and the other for syncing to our iPods. I just switched from the Lossless library to the AAC library (kept on separate drives). iTunes now sees the AAC files, but incorrectly lists all the bitrates as their higher Lossless numbers. However, when I click on an individual song to play it, the bitrate appropriately is recognized as 256 kbps. When I do this, the total library size (listed at the bottom) also incrementally "shrinks."
    The problem is that I just purchased a new 80 GB iPod capable of holding the entire library (nearly 8000 songs @ 256 kbps, ~60 GB total size). Unfortunately, iTunes still thinks the library is much larger (using the incorrect Lossless bitrates to calculate file size). This results in it telling me (erroneously) that the library is too large to sync.
    Is there an easy way to get iTunes to realize the new file sizes? I could obviously double-click on each and every song in the library until iTunes "gets" the real size, but I assume there's a better way. I also assume that this will be an issue every time I switch between the two libraries.
    Help, anyone? Thanks in advance.
    KK
    iMac G5, Mini, PB G4   Mac OS X (10.4.6)  

    Thanks for the tip, but this didn't work, unfortunately. Following your lead a bit further, I even created a new playlist and dragged the whole library there, but the wrong bitrates (and thus, playlist size) still appeared.
    One more curious update regarding this situation. iTunes automatically populated my iPod with what it believed was a subset of my tunes that would fit on the iPod. The real size of this subset was ~20GB, but iTunes thought (using the Lossless size) that it was nearly 60 GB. In categorizing the contents of my iPod, it listed ~60GB of songs, ~1 MB of photos, 1 GB of video and still listed the space available as ~50 GB (for a grand total of ~112 GB of space on my 80 GB iPod). Clearly, iTunes used the calculated size to estimate the music content, but used the actual space remaining based on a query of the iPod drive. Strange, huh?
    I'm still looking for advice--any other suggestions would be welcome.
    KK

  • Best storage set-ups for large Libraries?

    Let's say "large Library" means 500 GB and/or 500,000 Images, and above.
    What set-ups for storage, on a Mac maximized with RAM, will provide the best Aperture performance?  The best bang-for-the-buck?  Assume Thunderbolt is available.  I don't know much about hardware: RAID, hybrid drives, etc.
    What sensible considerations should one make for backup and data-redundancy?
    How can I determine if the storage media is a bottleneck in the performance of Aperture on my system?
    I run most libraries off of external USB-3 drives mounted (sometimes directly, often via powered USB-3 hubs) to my
    MacBook Pro
    Retina, 15-inch, Early 2013
    Processor  2.7 GHz Intel Core i7
    Memory  16 GB 1600 MHz DDR3
    Graphics  NVIDIA GeForce GT 650M 1024 MB
    System Drive  APPLE SSD SD512E
    This works well for small and medium-size Libraries, but for large Libraries, I'm spending costly time waiting for the system to catch up.
    Some of that, demonstrably, comes when I run a three-monitor set-up.  (But this provides some welcome advantages to me.)
    Additionally, some back-ups and database repairs now take 12 hr. or longer.
    Thanks.

    Thanks William,
    I kept my c. 2011 MBP and use it for making back-ups, which is done automatically after I leave for the day.  My early-2013 MBP is so much faster (and has such a higher-resolution screen) that I don't use the older computer at all.
    William Lloyd wrote:
    Probably the fastest storage you can get is the La Cie "little big disk" which is a pair of 512 GB SSDs with a Thunderbolt 2 connection. The issue is, only the newest Mac Pro, and the newest retina MacBook Pros, have Thunderbolt 2 right now.
    OWC tech explained to me that TBolt2 allows 2x the throughput, but that the drives have to be able to provide it.  TBolt1 should provide 1,000 MB/s (do I have the units correct?), which is faster than most drives can provide.  So the bottleneck, for me, isn't likely to be the port, but rather the drives.  USB-3 can move 500 MB/s, which is still faster than -- afaict -- what my WD Passport drives can provide.
    As I currently see it, I need faster throughput, and I need to either trim my large Libraries or find a way to manage them so that the regularly used files are more speedily available.
    Regarding faster throughput, an external SSD comes to mind.
    The problem, for me, is that the large Libraries are close to 1 TB (with Previews).  While I don't expect them to grow (the data acquisition phase is, for now, done), it would be short-sighted to assume they won't.  That brings up the second consideration, which is how best to use spanned drives that contain an Aperture Library.
    As I see it (with my limited understanding of hardware), what I really want is _a lot more RAM_, or, barring that, a huge SSD scratch disk.  I have 200 GB free on my system drive, which is an Apple-supplied SSD, but it doesn't seem to be used as effectively as I'd like.
    WD is now selling a new portable TBolt SSD, the "My Passport Pro", available with 4 TB of storage and a throughput of 230 MB/s.  My current largest Library is on WD Passport drives, whose throughput is no faster than 480 Mb/s (max) (bits, not bytes, so 60 MB/s).  That's a huge difference, while still only about 1/4 of the speed possible with TBolt1, and 1/8 the throughput possible with TBolt2 (which my early-2013 MBP does not have, afaict).
    These are the questions I am trying to answer:
    - How can I measure my current throughput?
    - Can I determine the bottleneck in my use of large Libraries in Aperture?
    - Will new storage give me faster performance?
    - Which storage will give me the best "bang-for-my-bucks"?  (The La Cie "little big disk" seems to have its bang limited by my computer, which greatly lowers its quotient value.)
    In short: how can I get 900-1000 MB/s throughput on my machine, and is there any way to set up my Library so that, even though it is close to 1 TB, I can use a smaller, very fast drive for the most frequently read and written files?  (A small unit-conversion sketch follows this reply.)
    --Kirby.
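    Since the bits-versus-bytes conversion above is easy to trip over, here is a tiny JavaScript sketch of the arithmetic, plus a rough copy-time estimate; the interface speeds are nominal maxima, and real drives deliver less:

    // Convert an interface speed quoted in megabits to megabytes per second.
    function mbitToMByte(megabitsPerSec) {
        return megabitsPerSec / 8;
    }

    // Rough time to copy a library of a given size at a sustained rate.
    function hoursToCopy(libraryGB, MBperSec) {
        return (libraryGB * 1024) / MBperSec / 3600;
    }

    console.log(mbitToMByte(480));                   // 60 MB/s (USB 2.0 nominal)
    console.log(hoursToCopy(1000, 60).toFixed(1));   // ~4.7 h for a ~1 TB library
    console.log(hoursToCopy(1000, 500).toFixed(1));  // ~0.6 h at nominal USB 3 speed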

  • Aperture Library Size slowing iMac even when Aperture is closed

    I recently purchased an iMac 22 quad core running Lion, with all updates.  It currently has 12GB of Ram, and 190GB free on the hard drive.
    The iMac would load programs like Outlook within a few seconds and was very fast.
    I purchased Aperture and migrated about 38,000 images, keeping the existing file structure from my PC.  This took about 12 hours to process, update images, etc.
    Even if Aperture is closed, all programs now load much more slowly.  Outlook can take 20-30 seconds to open, and lists of email and contacts now take as much as a minute to load.
    This is only true for the user account with the Aperture library; programs for other users seem to work fine.
    I tried running disk utilities and restarting the machine, but no difference.
    Activity monitor shows no unusual memory, CPU or disk activity.
    I started a new user account, and all worked fine again...until I again created an Aperture library, and again, even if Aperture is not started, everything else runs slowly.
    Is there a file indexing feature that is overwhelmed by the addition of a large Aperture library?  (Deleting the Aperture library does not solve the problem; only starting a new user account does.)

    dphughes wrote:
    ...in both cases maintaining referenced files stored outside the catalog/library. The Aperture library is 500% the size of the Lightroom catalog - even with restricted preview sizes and no vault.
    If this is true, there must be a logical explanation giving Aperture users some benefit for the increased Library size.  Sorry, it has been a long time since I looked at Lightroom.  Does it store the same size previews and thumbnails as Aperture?  Since you are using referenced files, and Aperture uses non-destructive edits (even new versions do not create duplicate files, only a new set of pointers to the referenced masters), one would think previews and thumbnails must be handled differently.  Since you use both, what could it be?
    The Aperture Vault is a separate library from the Aperture Library, so having a Vault should not impact the size of the Aperture Library.

  • Why has the total library size gone from my iTunes?

    I downloaded the latest iTunes the other day, and now the total library size is missing; the window is actually too large for my monitor too, though I've not altered any settings.
    I HATE not having the library size showing at the bottom of the page when my iTunes library is open.  Does anyone know what's happened?

    Empty/corrupt library after upgrade/crash
    Hopefully it's not been too long since you last upgraded iTunes, in fact if you get an empty/incomplete library immediately after upgrading then with the following steps you shouldn't lose a thing or need to do any further housekeeping.  Note that in iTunes 11 an "empty" library may show your past purchases with links to stream or download them.
    In the Previous iTunes Libraries folder should be a number of dated iTunes Library files. Take the most recent of these and copy it into the iTunes folder. Rename iTunes Library as iTunes Library (Corrupt) and then rename the restored file as iTunes Library.itl. Start iTunes. Should all be good, bar any recent additions to or deletions from your library.
    Alternatively, depending on exactly when and why the library went missing, there may be a more recent .tmp file in the main iTunes folder that can be renamed as iTunes Library.itl to restore the library to a recent state.
    See iTunes Folder Watch for a tool to catch up with any changes since the backup file was created.
    When you get it all working make a backup!
    Should you be in the unfortunate position where you are no longer able to access your original library, or a backup of it, then see Recover your iTunes library from your iPod or iOS device.
    tt2

  • Backup strategy when library starts getting LARGE

    I have been using Vaults with my (managed) library, backing up to external 80GB firewire drives. My library is now approaching 75GB and will soon exceed the capacity of these little drives.
    What are some solutions people implement for backing up large libraries using vaults? My main concern is to always have an off-site backup (ideally two).

    I have two External 400 GB drives in an external SATA case, RAID 1 (mirrored with OS X software RAID). I also have a third 400GB drive that I swap out with one of those two drives every week or so, and keep offsite.
    The mirrored RAID provides instant backup in terms of data recovery if a drive fails.  The third drive provides true backup in case I get really clumsy with delete, or a cat runs across my keyboard, forever removing innumerable files in the process (pretty hard to do, but you never know when you might make a change you can't recover from).
    I do not use Vaults at all; they are a good idea, but I think beyond a certain size mirrored drives are a more immediate backup and a little easier to manage and even recover from (no library rebuild if you just have an exact duplicate of your library there).  They are more complex to set up, though, so for a lot of users the Vault makes much more sense.
    Looking forward, all that may change with Time Machine - I think the next major version of Aperture will replace the concept of the Vault with some kind of Time Machine integration.  This would give you the instant backup of changes that I like about mirrored RAID, while also allowing you to skim back through the history of adjustments in any version, something you cannot do today.  In that case I would have a primary drive for the Aperture library, a larger secondary drive for Time Machine (it needs to be larger to hold data and revisions), and a third drive in another external enclosure that I hook up once a week to make a clone of the Time Machine drive (using SuperDuper).
    I actually have a second 300GB mirrored RAID set that acts more as offline storage but I didn't want to confuse things bringing that up.
