Is there a practical limit to catalog size?

I get frequent enough requests to use old photos that I keep everything in one catalog, with all raw files on disk in directories by year\day. Currently there are several hundred thousand files (about half unedited), and the catalog size at last check was over 4 GB. I just keep adding primary and backup disk drives as needed, because they are relatively cheap (IBM has announced the first exabyte array, so that trend should continue). The only problem I've seen is that the catalog is slow to back up, but nothing that can't be worked around. Access to photos, searching, and editing doesn't seem to have slowed at all.
My question is: has anyone run into a problem with this approach?

What worked for me was to turn off the auto-normalise preference in GarageBand.
See this thread:
http://discussions.apple.com/thread.jspa?threadID=1403099&tstart=0
Hope it works for you.

Similar Messages

  • Is there an upper threshold of catalog size that shouldn't be crossed?

    so here's a quick version of my workflow:
    Shoot a job,
    Download cards via Image Capture to a pocket drive off my laptop and if the shoot is more than a one day ordeal,
    Import into Lightroom and have it render 1:1 previews of the day's bounty at the hotel at night, otherwise...
    Transfer the Canon 21mp CR2 files to the 4 drive RAID-0 on my primary machine and either import & build previews into a new job catalog there or if already built, transfer the catalog and previews off the pocket drive to that same RAID-0
    edit
    rename and organize
    process out to Tiffs for client and build web indexes
    convert any selected files to DNG, discard unstarred raw files
    move the catalog, preview, web and DNG files to my mirrored archive (online/offline depending on what drive is mounted)
    import the job catalog into a master catalog that resides permanently on both my primary and secondary MacPros.
    Images viewed in the master catalog are most likely offline and are viewed via thumbnails and 1:1 previews; the processed Tiffs are deleted from my 6TB RAID 30 days after delivery of the job to the client.
    I'm in the process, as I have time, of applying this process to archive drives of past shoots that had originally been cataloged with Aperture, or not cataloged at all.
    As of this writing, I have 17 individual 1TB & 1.5TB hard drives containing RAW & DNG files from 16.7, 21, 22, 39 and 60 megapixel cameras, each of which has a mirrored backup stored offline and off premises.
    Easily 300,000+ unique images.
    Is one .lrcat library referencing one lrdata set of previews going to be left standing when I'm finished?

    While I have no experience with such a massive amount of data in Lightroom, I can tell you that there is no upper limit built into the software that would prevent a catalog from operating on 300,000 images. You give us great detail about your hard drive set-up, and the speed of these drives certainly does affect catalog performance in LR, but the computer specs also affect catalog performance, and the only way to determine whether it will be fast enough for you is to try a large catalog. I also point out that processing 21 MP originals (and larger) will be a source of slowness that is distinct from the size of the catalog.
    I personally do not like multiple catalogs for a variety of reasons, but I am not a professional photographer. I do acknowledge that if your photographs can be split into non-overlapping sets (such as personal and work; or wedding photos vs landscape photos), then multiple catalogs certainly makes sense. Some pro photographers use one catalog per job, and other pro photographers use one large catalog.

  • Is there a limit for the size of video in iCloud?

    I recently upgraded to the iPhone 5S from the iPhone 5. Prior to using the 5S I backed up the contents to my 50GB iCloud storage.
    Unfortunately I have found that some of my videos were not downloaded to my new phone.
    Is there a limit on the size that iCloud will accept? Have I lost my videos?
    Cheers
    redman69

    The number must be stored in some internal structure like an array. And there's a limit on the size of an array, because the maximum index you can use is Integer.MAX_VALUE. So it follows that there must be a maximum BigInteger value too.
    Why can't it be split into 2 arrays when it overflows the first? I don't know the actual implementation, but it could be of relatively unlimited size.

  • Elements 4.0 Organizer - Catalog Size?

    Hey all,
    I am about to organize a lot of digital scrapbooking files, and I am considering doing it in my current pictures catalog. I will put an old date on them and tag them so they will not interfere with my photos. I am approaching 10,000 items in my current catalog. Is there a practical limit on the number of items in a catalog? I can always start another catalog, but I'd rather not if I can get away with it.
    Thanks in advance,
    Rob Hix

    There is no firm limit on catalog size. In practice it depends on your computer's memory and speed, and on your tolerance for sluggishness. Catalogs of 65,000 items have been reported running well.

  • Catalog Size (PE6)

    Is there a limit to the size of the catalog in PE6? My catalog is approaching 178 MB and the thumbnail cache is nearing 1.5 GB.
    The Organizer keyword mapping function is responding very sluggishly, even after performing a catalog repair and optimization.
    Any suggestions on a fix?

    No, there isn't any limit on catalog size; the catalog simply keeps growing with the number of assets in it.
    Just curious to know the following:
    - As the thumbnail cache size is 1.5 GB, how many assets do you have in the catalog?
    - How come the catalog size is 178 MB while the thumbnail cache is 1.5 GB (I guess the catalog size you mentioned is for the catalog.pse6db file only)?
    ~Andromeda

  • Is there a time limit for a podcast to publish on iweb?

    I am creating a podcast in GarageBand. I would like to do it as a series of lectures, about 13 minutes each.
    Is there a time limit or file size restriction for publishing on my website in iWeb?

    I don't think so (provided you have sufficient space on your host), but to get a better answer, you should ask the question in the iWeb forum.
    (This is a mirrored question; somehow it got posted twice, so if you want to respond, try responding to the other thread and let this one drop into forum oblivion...)

  • Is there a practical size limit on JPEG format image files?

    I have noticed when working with Photoshop v7.0 on Windows XP that, when creating a JPEG format image file from a very large scanned image (more than 800 MB, perhaps), the .jpg file will usually be saved to disk at any of the available compression ratios without any reported error, yet when the saved file is re-opened, an error is reported - something along the lines of "missing or corrupted end of file marker" - and the image cannot be loaded.
    If before saving the original image is substantially reduced in size, e.g. by reducing the resolution of the image, a viable JPEG format file can be created, which will subsequently load successfully, though technically with data loss.
    This appears to happen only with JPEG format files - PDF format files can be saved from the original image at similar JPEG compression ratios, without error on reloading.
    I do not know if this also occurs with any other Photoshop versions. I am working with Adobe Photoshop v7.0 on Windows XP, although we are preparing to upgrade to CS (which version I don't know) in the near future, for reasons other than those detailed above.
    Does anyone know if there is a technical explanation for this behaviour?  Is there a file size limit for writing viable JPEG files (which Photoshop does not report as an error when exceeded)?
    Thanks

    Chris Cox wrote:
    Where did you get the idea that we weren't doing something about it?
    That's what I gathered from "JPEG cannot [...] in Photoshop (code difficult to replace)".
    But if you say this is in the works right now, that's good news indeed.
    Chris Cox wrote:
    And where did you get the idea that posts get deleted? (other than SPAM)
    Are you sure you're reading the same forum as the rest of us?
    That's what happened:
    The content you posted below was reported for abuse and removed by our
    moderators:
    Subject: Re: Is there a practical size limit on JPEG format image files?
    Posted: 4/29/11 9:11 AM
    Anyway, you gave an honest answer on this technical issue, Chris, and that's what counts.
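    A format-level aside the thread doesn't spell out: JPEG stores image width and height in 16-bit header fields, so neither dimension can exceed 65,535 pixels no matter what compression ratio is chosen. A minimal Java sketch of a pre-save check along those lines:

        import java.awt.image.BufferedImage;
        import java.io.File;
        import javax.imageio.ImageIO;

        public class JpegDimensionCheck {
            // JPEG headers hold width/height as 16-bit values, so 65,535 px
            // per side is a hard ceiling of the format itself.
            static final int JPEG_MAX_DIMENSION = 65_535;

            public static void main(String[] args) throws Exception {
                BufferedImage img = ImageIO.read(new File(args[0]));
                boolean fits = img.getWidth() <= JPEG_MAX_DIMENSION
                        && img.getHeight() <= JPEG_MAX_DIMENSION;
                System.out.printf("%d x %d -> %s%n", img.getWidth(), img.getHeight(),
                        fits ? "within JPEG's limits" : "too large to save as JPEG");
            }
        }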

  • Are there practical limits on elements organizer catalog size?

    I have 90K items in my PSE11 Organizer catalog. I'm running Win8, and I'm getting nervous that catalog size will become an issue. Backing up the catalog is cumbersome, since it is 337 GB. If I need to break it up into smaller pieces, what is the best way to do this? Is there a bridge or link between catalogs? Thanks

    See:
    http://www.johnrellis.com/psedbtool/photoshop-elements-faq.htm#_Limits_on_the
    and:
    http://www.johnrellis.com/psedbtool/photoshop-elements-faq.htm#_Splitting_and_rearranging
    and:
    http://www.johnrellis.com/psedbtool/photoshop-elements-faq.htm#_Merging_Catalogs
    The only real problem with big catalogs is what you have found: backups are long and tedious. From the above links you can see that it's a bad idea to split catalogs unless they have nothing in common. Unlike Lightroom, the Organizer offers no satisfactory way to merge catalogs. And the size of the catalog has no real impact on the speed of your organizing.
    My suggestion would be to think about alternative backup procedures: in any case, if you want to upgrade to Lightroom, you'll have to choose your own backup system for picture files. LR only offers backups of the catalog (database).
    - Once you know where your big catalog is stored (menu Help / System Info), it's easy to create a desktop shortcut so that you can save the catalog itself; see the sketch after this list. That's very quick compared to a backup of all your files (several hours).
    - Consider using PSE incremental backups. I am not fond of them, but they are acceptable if you master the process: they save the database fully each time and let you restore to a given backup session. Many users rely on incremental backups without understanding the process and are unable to do a complete restore in case of a big problem.
    - Consider using external backup tools (as if you were on Lightroom). They can be easier, and they offer automatic, periodic backup workflows. I am using SyncToy with Win7 and do periodic clones of my drives with Acronis.
    - Continue to use the integrated backup procedure when you have to change OS, computer, or drives, and migrate all your pictures and catalog.
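    The desktop-shortcut idea can also be scripted. A minimal Java sketch with hypothetical paths - substitute the catalog location reported under Help / System Info:

        import java.nio.file.*;
        import java.time.LocalDate;

        public class CatalogBackup {
            public static void main(String[] args) throws Exception {
                // Hypothetical locations -- adjust to the path shown by
                // Help / System Info in the Organizer.
                Path catalog = Paths.get(
                        "C:/ProgramData/Adobe/Elements Organizer/Catalogs/My Catalog/catalog.pse11db");
                Path backupDir = Paths.get("D:/Backups/Organizer");
                Files.createDirectories(backupDir);

                // Copies only the database file, not the media: quick compared
                // to an integrated backup of 337 GB of pictures.
                Path dest = backupDir.resolve("catalog-" + LocalDate.now() + ".pse11db");
                Files.copy(catalog, dest, StandardCopyOption.REPLACE_EXISTING);
                System.out.println("Backed up to " + dest);
            }
        }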

  • Is there a limit on the size of the input for the Solve Linear Equations block?

    Hello,
    I'm trying to figure out why the Solve Linear Equations block will properly function with some sets of data and why it won't with others. What my program is doing is taking a signal and comparing it with a batch of sine and cosine waves to try and pick out patterns in the data. I have different sample sizes and it seems to work when I throw 3900 points at it. However, I have another set with 4550 points and it gives me incorrect amplitudes for my sinusoids.  Is there some limit to the size of the matrices that I can give this block? Or is there some other workaround that still allows me to keep all of my data?
    Thanks,
    David Joseph

    Well, the best way to show what I expect is to see the entire program. It's pretty evident when looking at the graphs that something isn't right. What is supposed to happen is that the runout amplitudes are found, and then those sinusoids are subtracted from the initial data, leaving tooth-to-tooth data and noise. When I use the larger arrays, it seems as though not all of the data gets through (count the peaks on the product gear runout graph vs. the initial one), and the amplitudes are much too small, such that nothing is really taken out and the tooth-to-tooth data looks like the initial data.
    Also, we will be using an FFT, but it will be limited to only determining the frequencies we should check. I've fought with the FFT blocks quite a bit and I just prefer not to use them. Plus, the guy I'm writing this for wants exact answers and does not want to pad or resample the data or use windows.
    The exact number of data points isn't important (i.e. 4550 vs. 4551), since I use the array size block to index the for loop.
    As for typical values, they can change a lot based on materials. But, the original 3900 data point sets and the 4550 data point sets used practically identical gears. So, use the original 3900 sets I've included as references (check the RO array block numbers to compare).
    I've included 3 3900 samples, 3 4550 samples, and 3 4550 samples that have been truncated to 3900 or so as constants on the block diagram.
    Also, the check for additional runouts (like 3 per rev, 4 per rev, etc..) is optional, but if you choose to use it, use positive integers only.
    I don't know how much of this program will make sense, and I have wires running everywhere... so good luck. Keep in mind I'm only a student and I hadn't touched LabVIEW until about 2 or 3 months ago.
    Thanks,
    David Joseph
    Attachments:
    Full example.vi ‏139 KB
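    For reference, comparing a signal against sines and cosines of known frequencies and solving for their amplitudes is an ordinary least-squares problem, which is what the Solve Linear Equations block is being handed here. A rough Java sketch of that math (not the LabVIEW code itself; the frequency and test data below are invented for illustration):

        // Fits y ~ a*sin(w*t) + b*cos(w*t) by solving the 2x2 normal equations.
        public class SinusoidFit {
            static double[] fit(double[] t, double[] y, double w) {
                double ss = 0, sc = 0, cc = 0, sy = 0, cy = 0;
                for (int i = 0; i < t.length; i++) {
                    double s = Math.sin(w * t[i]), c = Math.cos(w * t[i]);
                    ss += s * s; sc += s * c; cc += c * c;
                    sy += s * y[i]; cy += c * y[i];
                }
                // Cramer's rule on [ss sc; sc cc] [a; b] = [sy; cy].
                double det = ss * cc - sc * sc;
                return new double[] { (sy * cc - cy * sc) / det,
                                      (ss * cy - sc * sy) / det };
            }

            public static void main(String[] args) {
                int n = 4550; // the sample size the poster reports trouble with
                double w = 2 * Math.PI * 3.0; // e.g. a 3-per-rev runout component
                double[] t = new double[n], y = new double[n];
                for (int i = 0; i < n; i++) {
                    t[i] = i / (double) n;
                    y[i] = 1.5 * Math.sin(w * t[i]) + 0.5 * Math.cos(w * t[i]);
                }
                double[] ab = fit(t, y, w);
                System.out.printf("a=%.3f b=%.3f%n", ab[0], ab[1]); // ~1.500, ~0.500
            }
        }

    The array length never enters the solver's preconditions here, which is why a jump from 3900 to 4550 points should not, by itself, break the fit.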

  • DeserializeJSON - is there a limit on the size of the JSON data that can be converted?

    I have some valid JSON data that's being converted successfully by DeserializeJSON... until it gets to a certain size, or at least that's what appears to be happening. The breaking point seems to be somewhere in the neighborhood of 35,000 characters - about 35 KB. When the conversion fails, it fails with a "JSON parsing failure: Unexpected end of JSON string" message. And even when the conversion fails, the JSON data is deemed to be valid by tools like this one: http://www.freeformatter.com/json-validator.html.
    So, is there a limit on the size of the JSON data that can be converted by DeserializeJSON?
    Thanks!

    Thanks Carl.
    The JSON is being submitted in its entirety, as confirmed by Fiddler. And it's actually being saved successfully to a SQL Server nvarchar(MAX) field too; I can validate that saved JSON.
    I'm actually grabbing the JSON to convert directly from the SQL Server, and your comments / thoughts led me down the path of resolution.
    It turns out that the JSON was being truncated before it ever reached the DeserializeJSON call, and it was the cfquery retrieval that was doing the truncating. The fix was to enable "long text retrieval (CLOB)" for this datasource in CF Admin. I'd never run into that before, or even knew that this setting existed.
    Thanks again for your comments!
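    An "Unexpected end of JSON string" error is the classic signature of text cut off mid-value rather than of any size limit in the parser. A minimal sketch of detecting that signature (in Java rather than CFML, since the failure mode is language-agnostic):

        public class TruncationCheck {
            // True if the text ends inside a string literal or with unclosed
            // braces/brackets -- the telltale sign of driver-side truncation.
            static boolean looksTruncated(String json) {
                int depth = 0;
                boolean inString = false, escaped = false;
                for (char ch : json.toCharArray()) {
                    if (inString) {
                        if (escaped) escaped = false;
                        else if (ch == '\\') escaped = true;
                        else if (ch == '"') inString = false;
                    } else if (ch == '"') inString = true;
                    else if (ch == '{' || ch == '[') depth++;
                    else if (ch == '}' || ch == ']') depth--;
                }
                return inString || depth != 0;
            }

            public static void main(String[] args) {
                String full = "{\"rows\":[{\"id\":1},{\"id\":2}]}";
                String cut = full.substring(0, 20); // simulate the cfquery cut-off
                System.out.println(looksTruncated(full)); // false
                System.out.println(looksTruncated(cut));  // true
            }
        }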

  • Is there a limit on the size of Exchange mailbox?

    One of my users has 8 GB of data stored in the Exchange server mailbox. (Yes, on the server, not in a local PST.)
    Since the iPad can store up to 1,000 recent messages, is there a limit on the size of those 1,000 emails stored locally on the iPad, assuming the user has a 64 GB new iPad?

    No.

  • Is there a limit on the size of SDHC card that can be read with the iPad camera connection kit?

    Is there a limit on the size of SDHC card that can be read with the iPad camera connection kit?

    I've successfully connected 32 GB SDHC and CF cards, so if there is an upper limit, it's at least 32 GB.
    I know SDXC will not work.
    With the cards that don't work, have they been formatted correctly? The camera connection kit will only read cards holding images (well, it'll only see the images), and those images have to have a file name of exactly 8 characters (DSC_2342, for example) and they have to be in a folder named DCIM.
    Anything else it won't read.
    I put a photo on there called 'Christmas' and the connection kit won't see it. I put a photo on there in the DCIM folder named XMAS2342 and it'll see that.
    So it's possible that those cards weren't read because they weren't speaking the right language.
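    A quick way to sanity-check that naming rule - a Java sketch; note the real DCF layout nests numbered subfolders (e.g. DCIM/100CANON), which this simplified check ignores:

        import java.io.File;

        public class DcfNameCheck {
            // Approximates the rule above: an 8-character base name inside a
            // DCIM folder is visible to the camera connection kit.
            static boolean visibleToKit(File f) {
                String name = f.getName();
                int dot = name.lastIndexOf('.');
                String base = dot < 0 ? name : name.substring(0, dot);
                File parent = f.getParentFile();
                return parent != null && parent.getName().equals("DCIM")
                        && base.length() == 8;
            }

            public static void main(String[] args) {
                System.out.println(visibleToKit(new File("DCIM/XMAS2342.JPG"))); // true
                System.out.println(visibleToKit(new File("Christmas.jpg")));     // false
            }
        }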

  • Is there a limit to the size of string we can paste in a text area ?

    Hi,
    I have a dialog box and in it a Text Area. When I copy a large amount of data from another page and try to paste it into the text area, it doesn't work.
    I can't paste any more data into it, but I can type in it manually.
    Is there a limit to the size of string we can paste into a text area?
    Thanks in advance :)
    maya

    There is no limit to the string size. Since Swing follows the MVC architecture, the data to be pasted is stored in the model and the view shows the text to the user. If the text is large, the screen may appear to freeze for some time while the data is stored in the model and painted on the screen.
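    A minimal sketch of that model/view behaviour, using only standard Swing; the five-million-character string simulates a huge paste:

        import javax.swing.*;

        public class BigPasteDemo {
            public static void main(String[] args) {
                SwingUtilities.invokeLater(() -> {
                    JTextArea area = new JTextArea(20, 60);
                    // The Document (model) accepts the text immediately; the
                    // view may freeze briefly while it lays out and repaints.
                    area.setText("x".repeat(5_000_000)); // String.repeat needs Java 11+
                    JFrame frame = new JFrame("JTextArea size test");
                    frame.add(new JScrollPane(area));
                    frame.pack();
                    frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                    frame.setVisible(true);
                });
            }
        }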

  • Is there a limit on the size of an array?

    I have two arrays: array1 and array2.
    I have a function that is iterating through array1. If an item in the array matches a particular condition, then that item is pushed onto array2. I can successfully push up to 19 items onto array2. However, if I attempt to push any additional item on top of the 19 items already in the array, then that push fails. Can I not have more than 19 items in an array? Is there any way to increase the size of the array?

    If you are seeing only 20 array elements in the Data Browser, it does not mean that the array is limited to 20 elements*. You are probably seeing a display restriction in the Data Browser. Have you checked the actual number of elements in your final array by testing its length property?
    You can change the number of elements displayed in the Data Browser in the ExtendScript Toolkit preferences, on the Debugging page.
    *Note that 20 items are shown, as the first one has an index of 0.
    Ian

  • How can I limit the vertical size of the plot legend?

    Hi all,
    in my program I use a waveform graph and its plot legend.
    My problem is that the vertical size of the plot legend grows beyond my frame and screen if I add too many plots to the graph!
    Is there a possibility to limit the vertical size of the plot legend and/or to use a vertical scrollbar in the plot legend?
    I use LV 8.2.1 .
    Thanks
    daHans

    You can write to the "Active Plot" property node. The example given in the thread I linked to before shows using this. Did you take a look at that example?
    I'm only suggesting the alternative of an "Active Plot" control if you're trying to give the user the ability to manipulate one of the plots (like color, point style, etc). If you have a lot of plots, and the plot legend is too big, you can provide a numeric control where the user selects the plot, and then additional controls to set the properties for that plot. Not as intuitive as the plot legend, but if that's what you've gotta do, that's what you've gotta do. Attached is a simple example (LabVIEW 8.20).
    Attachments:
    plot.vi ‏21 KB
