I'm in the process of starting with LR. I've started importing the whole bunch of existing pictures. These pictures are on a set of NAS servers (7 servers). My first plan is to import all approx. 10 TB of pictures into one catalog. I can see, though, that A) this will take a really, really long time (approx. 1-2 weeks running 24/7) and B) it will produce a preview catalog that is not manageable (a huge number of files).
My naive thought was to have all pictures in one database and be able to search across the whole set of pictures.
So I got to thinking that perhaps there is another scheme for using LR catalogs.
Do any of you have some general considerations or best practices on planning and using LR catalogs?
Why divide up the control of your picture collection? LR works best as one main catalogue managing all your work/workflow, and a single catalogue can record hundreds of thousands of pictures.
My first thought was to have all pictures in one catalog.
So far I've only gone through a minor part of the total number of pictures. I can see that it will take 1-2 weeks (24/7) just to import and generate previews.
Today I tried to back up the physical catalog (.lrcat file plus Previews.lrdata folder). I will end up with an approx. 24-hour operation just to back up this obscene number of files.
Right now I'm trying out a scheme of zipping the physical archive and then storing that in the backup.
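If it helps, a minimal sketch of that zip-and-store idea in Python - only the .lrcat database goes into the archive, and the Previews.lrdata folder is skipped since LR can rebuild it. The `backup_catalog` name and paths are just my own illustration, not anything built into LR:

```python
import zipfile
from pathlib import Path

def backup_catalog(catalog_dir, archive_path):
    """Zip only the .lrcat database file(s) in catalog_dir,
    deliberately skipping the huge Previews.lrdata folder
    (Lightroom can regenerate previews on demand)."""
    with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in Path(catalog_dir).glob("*.lrcat"):
            zf.write(f, arcname=f.name)
    return archive_path
```

Run it against the catalog folder (with LR closed) and the resulting archive stays small, because the database compresses well and the preview files never go in.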
So my thought was that perhaps LR catalogs are manageable if we're talking 100,000 picture files. But above that - perhaps 10x, 20x, 50x as much - things will become completely unmanageable, if you want to establish a backup of both catalog and preview data.
Perhaps I'll end up with my original plan and hope that import, backup, etc. will be manageable.
You can maybe save yourself a lot of time, IMO, by not backing up the large and complex "Previews.lrdata" folder. The catalog database file and the image files are the important parts.
If you do not write LR metadata to the files, the catalog database backup will be the only protection for your editing work; but backing it up does at least protect everything - which writing partial metadata to the files will not have done.
However, if you do write LR metadata out to Raw - or other - files saved in DNG format, then each time you edit something, your next backup has to include all the unchanged image data, because it is all held in the same file. If you have instead used proprietary Raw files, only a small separate XMP sidecar file is written out by LR. That means there are only a few KB to back up for each image that has been worked on since the previous time.
There are three sets of data:
1) The LR catalog (.lrcat file)
2) The preview data (the Xxx-Previews.lrdata folder, with an obscene number of files)
3) The actual picture files (raw files etc)
The catalog (1) is simple to handle. The actual picture files (3) are already in a backup routine, and this runs perfectly even though it is a huge amount of data.
My problem is the preview data; it is simply hard to manage. I know that if it's gone, it'll just be created again. But with the amount of data I'm working with, this preview generation will take about 1-2 weeks (24/7). It would be cool to be able to handle this in some sort of backup scheme.
But perhaps a consideration would be that the catalog (1) holds all your database information, and thereby all the information that is the basis of searching. The picture files (3) of course must be saved; that's the primary concern. But the preview data (2) can be regenerated - even though it takes a huge, huge, huuuge amount of time.
In fact it's completely pointless to back up the Previews.lrdata folder. Exclude it from backup.
I'd also not go to the trouble of generating previews. Let Lightroom build them when it needs them, ie when you open a folder and view the images. With so many pictures, there's no point building the previews for everything.
Pretty cool idea. If I just skip generating previews, the worst that can happen is that I must wait a bit when I jump into a folder or a collection.
I've been working with Bridge for quite some years but am a newbie with this LR - really appreciate the advice.
Yes, but dividing your total set of pictures up into a set of catalogs means that you can't search or make collections across the whole set of pictures.
My physical picture archive is organised by A) year, B) film (takes) and C) image numbers. If I were to divide these into catalogs, it could be catalog1999, catalog2000, catalog2001, etc.
This would mean that I could only search/make collections within a single catalog.
film0001 (img 001, img002 ...)
film0002 (img 001, img002 ...)
film00xx (img 001, img002 ...)
film00xx (img 001, img002 ...)
On top of this I've decorated them with keywords and some metadata. It works fine in Bridge, but with very few pictures and plenty of time :-)
I think it depends on what "it" you want to work faster. It is my understanding that a catalog has to be extremely large (many hundreds of thousands of images) before performance becomes an issue. Since LR does not provide a means to search across catalogs, finding images could be much slower with many small catalogs. Even if you felt you had to have multiple catalogs, unless they were discrete (that is, each image is in only one catalog) you run the risk of losing control of editing versions.
The common view is that one catalog is preferred to get the most out of LR. There are of course exceptions (like having one catalog per client for some kinds of professional shooters), and such things always depend on one's particular workflow. But if you feel compelled to create multiple catalogs, I would ask yourself whether your workflow is the best and what sacrifices you are making for whatever you are gaining - and I would guess that speed is not what you are gaining.
The concept of one catalog is the most logical to me. Combined with keywording and smart use of smart and static collections, it will work fine.
After looking into it and having this talk/discussion, my main problem - the preview data - isn't really a problem. You can just leave it to LR to generate them as I go along. Browsing a bit in the config dialogs showed that previews have a finite lifetime anyway; you configure how long to keep them.
So everyone (well, a lot of people) says that there's no problem with big databases. And if previews aren't the concern that I initially thought - well, then my catalog strategy is pretty much clear:
Heavy use of collections
Frequent backup of the LR database (using LR itself)
Frequent (diff) backup of the pictures
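As a sketch of what that (diff) backup step could look like - copy only files modified since the previous run - here's a hypothetical Python version. The `diff_backup` helper and the timestamp handling are my own illustration (real backup tools like rsync do this better), just to show the idea:

```python
import shutil
from pathlib import Path

def diff_backup(src, dst, since):
    """Copy only files under src modified after `since` (a POSIX
    timestamp), mirroring the folder structure under dst."""
    src, dst = Path(src), Path(dst)
    copied = []
    for f in src.rglob("*"):
        if f.is_file() and f.stat().st_mtime > since:
            target = dst / f.relative_to(src)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)  # copy2 preserves mtime for the next diff
            copied.append(target)
    return copied
```

Record the time of each run, pass it as `since` on the next one, and only the new or edited pictures get copied.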
These forums are really cool - as is the amount of knowledge available. Hope to give something back at some point :-)
Make sure you take a good look at smart collections. Because they're using a database query, they're much faster than Bridge when you're looking through large numbers of pictures spread across different drives. You can structure the collections more too - so a collection set might contain a mix of dumb and smart collections. But in general you seem to be heading in the right direction.
One of the areas I emailed Adobe about in the past is the help files, which don't really give enough explanation.
I've read up on smart collections several times now and still can't grasp what on earth they're all about!
Think of them as saved queries or automatically-updating portfolios. Let's say I have a continuing need to find my best photos of cars, or weddings, or family pictures before a certain date. I just set up a smart collection with an appropriate set of criteria.
Putting one in English: "All my pictures with the country not UK, and a rating of 3 or more, and with 'cars' in the keywords or 'Mercedes' in the caption field (just in case I forgot to keyword a Merc rally)". So it now lists 80 pictures of cars taken when I'm travelling. Then I go outside the UK again, shoot a few cars, and a few more merit 3 stars. The smart collection automatically updates and displays those extra shots. So once it's set up, it's an effortless way to get back to this portfolio of best car pics and keep it updated. That's just one hypothetical example.
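Since a smart collection is essentially a saved database query, the rule above can be sketched as an equivalent filter in Python. The field names here are hypothetical (LR's criteria dialog handles all this for you); it's just to show what the query "means":

```python
def smart_collection(images):
    """Hypothetical version of the rule above: country is not UK,
    rating is 3 or more, and 'cars' is in the keywords OR
    'Mercedes' appears in the caption."""
    return [img for img in images
            if img["country"] != "UK"
            and img["rating"] >= 3
            and ("cars" in img["keywords"] or "Mercedes" in img["caption"])]
```

The point is that the filter re-runs automatically: add a new image whose fields match, and it appears in the collection without any further work.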
I have been exploring this recently, especially as my catalog file is now 1.2 GB and growing. I am not in the least worried about a 1.5 GB, 5 GB or 10 GB catalog, but I am concerned about backing this up as I exit Lightroom. It is taking between 5 and 10 minutes to complete the standard Lightroom backup. If I am under time pressure... it's tempting to skip... and eventually, disaster.
The challenge: make a few changes to a tiny percentage of my catalog (probably a few KB of metadata in the catalog)... but now I have to back up the full catalog.
If it is becoming a usability problem now, then it will be a bigger problem as time goes by (i.e. the catalog only gets bigger...).
My current personal conclusions are as follows:
Keep all my images in a single catalog. Export current project images to a working project catalog. When happy that this project has matured, re-import the project catalog into the master catalog.
I can make any number of changes in current-project mode; backing up that catalog happens in a heartbeat (relative to backing up the entire master catalog every time).
All my images are in a single catalog, and I am always aware of the current project status.
The current project always has a short list of folders, which is easier to manage and focus on the work in hand; it is amazing the number of times I have clicked through a large hierarchy of folders to get to my current project.
This will work for me... (others' mileage will vary).
Some interesting facts re my catalog:
Catalog size: 1.2 GB
No. of images: approx. 60,000
Metadata table size within the catalog: 515 MB
History table (within the catalog): 492 MB
The largest 75 images of history: approx. 253 MB, with 40 images consuming 200 MB
The largest 200 images within the metadata table: approx. 35 MB
My assumption is that brush strokes are responsible for most of the bloat factor - the price one pays for non-destructive editing.
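For the curious: an .lrcat file is an ordinary SQLite database, so you can poke at a *copy* of it (never the live catalog) with any SQLite tool to see where the bulk lives. Adobe doesn't document the schema and table names vary by version, so this generic Python sketch just lists whatever tables exist with their row counts:

```python
import sqlite3

def table_row_counts(db_path):
    """List each table in an SQLite database (e.g. a copy of an
    .lrcat file) with its row count - a rough way to see which
    tables hold the bulk of the data."""
    con = sqlite3.connect(db_path)
    tables = [r[0] for r in con.execute(
        "SELECT name FROM sqlite_master WHERE type='table'")]
    counts = {t: con.execute(f"SELECT count(*) FROM '{t}'").fetchone()[0]
              for t in tables}
    con.close()
    return counts
```

Row counts aren't byte sizes, but a history table with millions of rows is a good hint about where a 1.2 GB catalog's weight comes from.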
As far as I am aware... there are no limits on the number of catalogs... but you can only open one at a time.
I suggest you create a few small test catalogs with just a few images in each (raw, JPEG, virtual) and get familiar with what you want to achieve on a small scale.