My Lightroom catalog is 8.63GB and is stored in the default location on my Mac (Pictures). The machine has 500GB of flash storage, of which less than half is used. My OS is 10.11.6, processor 3.4 GHz Intel Core i7, 32GB RAM, NVIDIA GeForce GTX 680MX 2048 MB.
I have set Lightroom to back up this catalog to a 3TB external HD, of which 1.79TB is still available. This drive is connected via USB 3.
Over the last year or so the backup process has become increasingly slow. I've just timed it and it took about 4 minutes in total. To me this seems quite excessive; am I expecting too much?
I usually find it starts off slow (optimizing) and then speeds up rapidly. Backups are now compressed, which adds an extra step but uses disk space more efficiently.
I would be happy with 4 mins.
Is your LrCat file really 8.x GB, not including the LrData preview folders?
Working with a file that big takes time. Four minutes is probably reasonable.
Given that my images are organized by date, I would probably archive off some of the older years into a different catalog and delete them from the current catalog so that the main catalog is 3 GB or smaller.
Yes, that is the actual size of the catalogue file. The previews folder is 27.84GB in total.
How would you suggest archiving the oldest part of the catalogue? Are there some step-by-step tutorials on this that you'd recommend?
This has been an issue for me for the last few years. Backing up, optimizing, and copying the catalog together take over an hour for my catalog on a blazing-fast machine (listed below). Why does this happen in 2022? One may say the size of the catalog (50GB) is the bottleneck, but on a fast machine with a crazy fast drive, something in the software is hanging this up. It is DISAPPOINTING.
Computer: M1 Pro Max MacBook Pro 64GB of RAM, everything on the internal SSD
50GB for just the catalog file? That is gigantic and not normal. How many images does your catalog contain?
2022 so far is 275,000 or so. My point though is that literally manually copying the catalog files to another drive takes less time than Lightroom's automatic backup, and that's funky to me.
Size matters in this case, but a catalog containing 275,000 images should be nowhere near 50GB. More likely 5GB, which should back up in a few minutes. Or do you mean that the catalog folder is 50GB? Lightroom's backup process is more than just making a copy, however. It also verifies the integrity of the catalog and optimizes it.
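Since an .lrcat file is just a SQLite database, the verify and optimize steps plausibly correspond to SQLite's `PRAGMA integrity_check` and `VACUUM` (Adobe doesn't document the exact steps, so treat this as an illustrative sketch, and only ever run it on a *copy* of a catalog, never the live file). VACUUM rewrites the entire database file, which is why the optimize step gets slower as the catalog grows:

```python
import sqlite3

def verify_and_compact(db_path):
    """Sketch of a verify-then-optimize pass on a SQLite file.
    integrity_check scans every page; VACUUM rewrites the whole
    file to reclaim free space -- both are I/O-bound on big files."""
    con = sqlite3.connect(db_path)
    try:
        (status,) = con.execute("PRAGMA integrity_check").fetchone()
        if status != "ok":
            raise RuntimeError(f"integrity check failed: {status}")
        con.execute("VACUUM")
    finally:
        con.close()

# Hypothetical path -- point this at a COPY of your catalog:
# verify_and_compact("MyCatalog-copy.lrcat")
```

On a 50GB file, both steps have to touch every page of the database, so even on fast SSDs the wall-clock time adds up.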
The .lrcat file is 54GB.
That is incredibly large for a catalog with that number of images, and certainly the main cause of the problem. Do you have a huge number of history steps for each image? Lots and lots of spot removal in all images because your camera sensor desperately needs cleaning? Anything else that might explain it?
Average history per image is 4 steps (import w/preset, paste settings, adjust temperature, export), no spot removal at all, in any image...
The things that contribute most to catalog size are history and snapshots.
My catalog was 60GB.
With good history management I was able to bring it down to 20-22GB.
It's still big because I have thousands of photos for which I still need the history.
To figure out why your catalog is big, you can run the "sqlite3_analyzer" tool on it.
Download it for Mac here: https://sqlite.org/download.html
Then, on the command line, run sqlite3_analyzer with your .lrcat file as an argument, e.g.
sqlite3_analyzer MyCatalog.lrcat > report.txt