I got an update for ACR (CS6) today and installed it. Suddenly, EVERY time I open Bridge or return to it from another page, it starts thumbnailing my images from scratch. We're talking hundreds of images in this folder. This is new; it did not do this yesterday. What is going on and how do I fix it? If I leave Bridge (even if it's still open) and go back to it, it starts the thumbnail extractions all over again and THEN starts on the full size extractions all over again. The result is that all of Photoshop has slowed to a crawl. This is the second time in a month that an update has caused new problems that did not exist before. It's beyond frustrating.
That's cute. I just sat through 10 minutes of full size extractions counting down. It got to zero and STARTED AGAIN. Okay, guys, what's going on? I have 50 GB of images from Asia I need to process. I truly don't have time for this. And the thumbnail extractions just started over again.
And of course, that leaves us with the main question: with so many people affected, why hasn't Adobe fixed this???????
Too busy trying to force everyone to the cloud so they can make more money. Very sad. Very annoying. Frankly, bordering on criminal.
Read post 334.
If all this wasn't so important to my workflow, I would just give up. Anyway, thanks everyone for your ongoing help with this. As I said, I will go to a friend's office next week and see what happens with my Canon RAW files using a PC with Windows 7 and the cloud.
Switching back to Bridge does not start regeneration on a file already regenerated after purge.
So rebuilding is going on as I look at folders from last year that weren't opened until after the ACR update. The cache has grown from 11.6 GB to 12.8 GB.
My system is much older (2007 AMD Athlon II x4 CPU, stock speed), yet it seems to be running Bridge better than yours, at least.
So it raises the question for the Bridge engineers: what is the reference system to use?
Noel Carboni has an inexpensive Win7 optimization publication I use to configure mine. There may be some clues there. No help for Mac, though.
It is very sad to read about all this trouble. I have stayed out for a while because I have never experienced this problem, although I really can imagine how frustrating it is.
I always use DNG, which has the metadata written into the file itself, so neither a sidecar file nor the central database is needed. There is a small downside (at least for me; for some others it is a big one): any change to ACR settings, rating, label, IPTC or the file name causes the complete DNG file, containing both image data and metadata (about 20 to 25 MB for an 18 MP file), to be rewritten (as it also is with a PSD, TIFF or JPEG, to name a few), instead of just the very small XMP sidecar file or central database entry, which is usually a few KB.
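To put that overhead in rough numbers, here is a quick back-of-envelope sketch (the 22 MB and 4 KB figures are just midpoints of the averages mentioned above, purely illustrative):

```python
# Back-of-envelope cost of re-rating 1,000 images, using the averages above.
dng_size_mb = 22        # the whole DNG is rewritten on every metadata change
xmp_size_kb = 4         # a sidecar workflow rewrites only this

images = 1000
dng_rewritten_gb = images * dng_size_mb / 1024
xmp_rewritten_mb = images * xmp_size_kb / 1024

print(f"DNG workflow rewrites ~{dng_rewritten_gb:.1f} GB")  # ~21.5 GB
print(f"XMP workflow rewrites ~{xmp_rewritten_mb:.1f} MB")  # ~3.9 MB
```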
However, on my current system (Mac Pro, 32 GB RAM and a PCI slot SSD) I can't say it bothers me at all, because saving a DNG really is fast. Even my former Mac Pro (a much slower late 2007 model, with less RAM and a normal HD) was not disrupting my workflow much; it was never extremely slow.
Yes, it can never be fast enough until it happens in real time. Depending on file size and count it can take a while, but it is unacceptable not to be able to use Bridge for several minutes or longer during a batch rename, or while updating previews and thumbs after making changes, because the spinning wheel shows on a Mac and Bridge freezes. With hundreds or even thousands of files it naturally takes a bit more time.
And don't forget: if you think back to the days we renamed files manually (mostly out of ignorance) or added IPTC by copy-pasting to each image, then you really know what slow means...
Some observations on a few posts:
Yammer P:
And you can just copy that to a new Mac? Does that work?
I can confirm this. I also keep a backup of my most important user library folders with custom settings; you can drag and drop them to replace the existing ones without a problem, and only a restart of the application is needed to reactivate the replaced settings.
Sorry, I'm skeptical. Have you ever tried it?
Now you sound like a Mac user when the subject is something being easy on Windows…
I have noticed too that, if you visit an uncached folder, thumbnails are rebuilt giving the currently visible files a priority. If you drag the scroll bar down to another group of files, they are then given priority. If you click on a file, it takes priority (as does the image either side of it).
This is as designed and has been the case for as long as I can remember. It should do so, because it gives you the option to prioritize the files you need most when in a hurry.
CameraAnn:
I am inclined to use enormous folders (over 7,000 images in a single one covering a particular expedition, until I eventually get around to refining the whole lot down into smaller folders), so to have thumbnails rebuilding themselves every time I access one of these larger folders would be intolerable.
I have to confess that every now and then recaching is needed, but for the greatest part, in my experience, when using Bridge to move files to other folders via the folder panel it respects the previously built cache.
I have never checked whether the actual cache files move from their original folder inside the cache, or whether a small extra file with the new location data is created; it simply works as expected (except sometimes, mostly when moving things around too fast). Copying to another disk also respects the previously built cache and HQ previews, without the need to rebuild the cache again.
Hudechrome:
So rebuilding is going on as I look at folders from last year that weren't opened until after the ACR update. The cache has grown from 11.6 GB to 12.8 GB.
I usually let it grow to around 40 or 50 GB before dumping the lot manually and starting over with a fresh cache.
A long time ago I stopped using Bridge for archiving purposes, because the ever-changing cache and its problems made long-term DAM useless. If you have set the preference to generate monitor-sized previews then, depending on your screen size, the cache files can add up very quickly.
I have a 30" screen, and the monitor-sized previews in the 1024 cache folder are in the range of 1 to 1.5 MB each.
Keeping 100% previews generates saved files of around 10 MB each.
So it is very hard to compare one user to another: cache sizes depend on original image size (MP), JPEG or raw, whether monitor-sized previews are generated, and whether 100% previews are kept.
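To illustrate how quickly that adds up, here is a rough sketch using the per-file sizes above (the 7,000-image folder count is borrowed from CameraAnn's post; all figures are illustrative, not measurements):

```python
# Rough Bridge cache estimate using the per-file sizes quoted above:
# 1-1.5 MB monitor-sized previews, ~10 MB kept 100% previews.
images = 7000          # e.g. one of CameraAnn's expedition folders
preview_mb = 1.25      # average monitor-sized preview on a 30" screen
full_mb = 10           # 100% preview, if "Keep 100% Previews" is on

print(f"1024 folder: ~{images * preview_mb / 1024:.1f} GB")  # ~8.5 GB
print(f"full folder: ~{images * full_mb / 1024:.1f} GB")     # ~68.4 GB
```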
For me, no problem. We know priority is given to the files sought during a rebuild. I tried to reproduce Yammer's problem with Collections. It's a bit slow to rebuild them, but once done it's good to go.
So yes, I see what they see but it seems not to affect my work unduly, nor is it different with the current ACR version.
There are good reasons to prefer sidecars, and as you found out, DNG takes more room. The only DNGs I use are those created with DxO, and the difference in file size is 4x; that is, a 9 MB NEF generates a 36 MB DNG.
You can build a much larger file base, including room for metadata, just staying with NEF (for Nikon, of course).
So not only the question of what the reference system is for collecting data and reworking the software, but also: what is the tolerance limit for less-than-perfect performance?
And what would perfect look like? Maybe generate all cache files automatically overnight? Or under the user's control?
But then, all I have to do to put all this in perspective is to recall the intensity of work in the darkroom just for the finishing part!
I've been looking at the cache and have a question: there is a folder labelled "full". It's slightly smaller than the 1024 folder, at 5 GB vs over 6 GB for 1024.
Why is it there?
The <full> cache folder contains cache files of 100 percent previews. It will likely contain fewer folders and files than the <1024> cache folder. Bridge may create 100% preview cache files when generating actual-pixel previews for its loupe and when clicking on a preview to zoom in. These 100% previews are retained in the <full> cache folder when "Keep 100% Previews in Cache" is checked in Bridge's Cache preferences.
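If you want to compare the two folders on your own machine, a minimal sketch like this totals them up (the cache path below is only an example and varies by OS and Bridge version; look up the real location in Bridge's Cache preferences):

```python
import os

# Example only: the Bridge cache location varies per OS and version,
# so check Bridge's Cache preferences for the real path on your system.
CACHE_ROOT = r"C:\Users\me\AppData\Roaming\Adobe\Bridge CS6\Cache"

def folder_size_gb(path):
    """Total size in GB of everything under path."""
    total = 0
    for dirpath, _dirnames, filenames in os.walk(path):
        for name in filenames:
            total += os.path.getsize(os.path.join(dirpath, name))
    return total / 1024**3

for sub in ("1024", "full"):
    subdir = os.path.join(CACHE_ROOT, sub)
    if os.path.isdir(subdir):
        print(f"{sub}: {folder_size_gb(subdir):.2f} GB")
```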
Thanks Robert,
I assume then, that if I empty that folder, Bridge will re-create those 100% previews if I want them again. It's 5 GB, whereas the 1024 folder is a bit over 6 GB.
Larry, I was unfortunate enough to lose an external FireWire drive (Seagate, btw) with a bit over 35 GB worth of image files late last year, and human error had led to both backups being overwritten by some totally different images by the wrong copy script that same night. I was left with only the 1024 folder.
Even the most expensive data recovery service in Livermore couldn't recover a single file due to extensive damage to the platters.
I made sure right after that that "Keep 100% Previews in Cache" is checked in Bridge's Cache preferences at all times, but I sure hope I never have that extremely rare coincidence happen again.
Hudechrome wrote:
Thanks Robert,
I assume then, that if I empty that folder, Bridge will re-create those 100% previews if I want them again. ...
Yes, that has been my experience. You might erase most but not all folders, retaining the ones for images you are currently working with or might want to view at 100% in the near term.
There are good reasons to prefer sidecars, and as you found out, DNG takes more room. The only DNGs I use are those created with DxO, and the difference in file size is 4x; that is, a 9 MB NEF generates a 36 MB DNG.
With today's prices of storage I don't bother that much with space anymore (being a professional with commercial revenues does help a bit in this case, but still, disk space and RAM are getting larger and cheaper at the same time).
You can build a much larger file base, including room for metadata, just staying with NEF (for Nikon, of course).
I use Canon, but the CR2 files also use XMP sidecars.
So not only the question of what the reference system is for collecting data and reworking the software, but also: what is the tolerance limit for less-than-perfect performance?
And what would perfect look like? Maybe generate all cache files automatically overnight? Or under the user's control?
This is very workflow-based, and then there's the old saying: 'each photographer uses his or her own workflow'.
Personally I never bother about prior settings, but I also have to say I do most of the post-processing in PS itself, using actions with adjustment layers and masks based on channels (check out Guy Gowan for more info; he has some free tutorials, but his paid tutorials are really excellent, and I learned a lot from his ideas).
All results are primarily based on the photographer's taste and style. That said, mood is also an important factor, as is how long you have been staring at the screen. How many times have I thought I had something wonderful, and an hour or a day later it seemed like rubbish, and also vice versa.
Hence I never bother about prior ACR settings. I start with it and complete the end result in PS, finish with IPTC, renaming etc. in Bridge, and from there distribute the files.
I often need older files from my archive, and when I do, I see that the files I then thought to be good are really bad compared to today's techniques and other improvements. So I have to start again from scratch anyway.
But then, all I have to do to put all this in perspective is to recall the intensity of work in the darkroom just for the finishing part!
While I am very glad I was part of both analog and digital photography, I really would not like to go through that old process again.
And not to mention that Cmd+Z was not possible: your only option to redo was to dump the result in the bin and start over with a fresh sheet of paper, waiting at least a minute to catch a glance of the result and more than 10 minutes to judge the dried result under good light. We really are spoiled in the digital era!!
and human error had led to both backups being overwritten by some totally different images by the wrong copy script that same night.
That is a real horror scenario, and sad to hear. Human error is still the greatest danger, and we have all been there much too often. I try to minimize errors with a second backup, stored externally, that never gets copied in the same run as the first backup. And also some built-in safety settings that warn if more than 50% of the data gets deleted, combined with several settings to first archive deleted files from particular folders. Nevertheless, we humans are still the weakest link when it comes to backup.
Really sad, Ramon. A 32 GB loss for me, depending on which sets were lost, could be devastating. So I back up in various ways. I also use a script, and it is easy to mess up because I have it set to back up the current year, which opens in the Edit window. It's a neat .bat file: it looks at what changed since the last backup and only updates Edit. The trick is to be sure I set the changes up right at the first of each year.
That being said, I also generate a print file, which gets supplemented manually, on 3 drives.
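For anyone who wants that same "only update what changed" behaviour without a .bat, here is a minimal Python sketch of the idea (the paths are hypothetical, and this is only an approximation of such a script, not the actual one mentioned above):

```python
import os
import shutil

SRC = r"C:\Photos\2013"        # hypothetical current-year folder
DST = r"E:\Backup\Edit\2013"   # hypothetical backup target

for dirpath, _dirnames, filenames in os.walk(SRC):
    target_dir = os.path.join(DST, os.path.relpath(dirpath, SRC))
    os.makedirs(target_dir, exist_ok=True)
    for name in filenames:
        src_file = os.path.join(dirpath, name)
        dst_file = os.path.join(target_dir, name)
        # Copy only when the backup is missing or older than the source.
        if (not os.path.exists(dst_file)
                or os.path.getmtime(src_file) > os.path.getmtime(dst_file)):
            shutil.copy2(src_file, dst_file)
```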
I also do much work in PS after ACR. However, I have certain presets in ACR which allow me to generate dramatic B&W images with a single click. The difference between doing it there and doing it in PS is huge. So if I need to consider a B&W conversion in PS, I dupe the NEF and leave it at the default settings in ACR. Or use a Snapshot, but I prefer to dupe it.
I have 6 drives available in the case, with several externals locked away. One internal drive is in a hot-pluggable tray, so I can run as many drives as I want and point the .bat at any of them.
Sometimes I still feel like a newbie, although I have been at this since PS 5.5, and photography in general since... since Columbus was a midshipman, it seems!
I also have an electronics engineering career running parallel to photography, with both digital and analog paths.
Nothing here.
Hmmm, I have but 29 folders in "full" and it is not adding any more. I purged one folder, have 100% checked, exited and reopened Bridge, allowed the folder to rebuild, and got no 100% previews from that folder. I even opened a file and watched it go through a cleanup process, but the last folder in "full" is still from April 9.
With only 29 folders already occupying 5 GB, maybe it's not a good idea to continue without pulling the C drive and installing a bigger one.
The process appears capricious.
Hudechrome,
In Edit/Preferences/Cache, do you have "Keep 100% previews in cache" checked?
Yep.
After watching what has been said pro and con about changing preferences to store metadata in the database rather than sidecar files, I am even more reluctant to do this. However, I notice that when Bridge is thumbnailing, it goes first through thumbnail extractions, then full size extractions. It's these full size extractions that take so long. And if the program is continually regenerating, then perhaps if full size extractions are no longer involved, it will work more quickly. I looked in Preferences and couldn't find a way to stop the full size extractions. Is this possible? And is it even desirable as a way to speed up the process? If so, what will be the difference in what I see?
I do remember, before I reverted to 7.2, that after cycling through the regeneration a few times, the program would stop. Of course, the moment I opened a new folder or left Bridge, it started again. However, if I am working for a while in this folder, perhaps this will be acceptable.
Full size and 100% are the same (check the resolution: the 100% files from the D7100 are 6000x4000, the same as the raw files).
ycardozo wrote:
.... I looked in Preferences and couldn't find a way to stop the full size extractions. Is this possible?
So far as I have seen, 100% (full-size) extractions to cache occur 1) when Generate 100% Previews is checked on, 2) after using the loupe or zooming in when "Keep 100% Previews in Cache" is checked on in Cache preferences, and 3) for files where the cache is being regenerated and 100% previews already exist - even if the generate-100% options in (1) and (2) are unchecked. To eliminate regeneration of existing 100% files in the cache you would need to purge them from the cache: purge the cache for the selection or folder, or manually erase the files or folder(s) in the <full> directory.
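If purging the <full> directory by hand gets tedious, something along these lines clears only the stale per-folder caches (the path is an assumption again; it runs in dry-run mode first so you can review the list before anything is deleted):

```python
import os
import shutil
import time

# Example path only - locate your real cache in Bridge's Cache preferences.
FULL_DIR = r"C:\Users\me\AppData\Roaming\Adobe\Bridge CS6\Cache\full"
MAX_AGE_DAYS = 30
DRY_RUN = True   # set to False to actually delete

cutoff = time.time() - MAX_AGE_DAYS * 86400
for entry in os.listdir(FULL_DIR):
    path = os.path.join(FULL_DIR, entry)
    # Remove per-folder caches that haven't been touched in MAX_AGE_DAYS.
    if os.path.isdir(path) and os.path.getmtime(path) < cutoff:
        print("stale:", path)
        if not DRY_RUN:
            shutil.rmtree(path)
```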
Agreed about the purge. I did save the file to a thumb drive, just in case. Gave me my 5 GB back!
I made sure right after that that "Keep 100% Previews in Cache" is checked in Bridge's Cache preferences at all times, but I sure hope I never have that extremely rare coincidence happen again.
You had better not rely on the full size cache files for backup purposes. While the files do have the same pixel dimensions as the original, they are not saved at high quality. The saved 100% image in the cache folder <full> is about half the size in MB of the same file opened in PS and saved as a maximum (12) quality JPEG.
I just used the same DNG file from my Canon 1Dx (18 MP - 5184 x 3456 pixels) for a small experiment:
- Generating a 100% preview resulted in a 6.5 MB file in the Bridge Cache/full folder.
- Opening the same file in ACR (without changing anything) and saving as JPEG max (12) resulted in a 12.8 MB file.
- Opening the same file in PS and, again without doing anything, saving as JPEG 12 resulted (strangely enough) in a slightly smaller file of 12.7 MB (only about 20,000 bytes difference between the two).
Beyond that, there are two major differences between full size saved previews and normally saved files that make them unreliable for backup purposes:
- 1: the full size preview strips all the metadata from the file.
- 2: the full size preview has sharpening applied. Which makes sense, because you would normally set sharpening to apply only to the preview, since sharpening should come at the end of the process (yes, there are millions of theories about sharpening; for me, sharpening raw files anywhere other than the preview is a no-go).
So investing in an extra HD and good backup software seems a better option.
I use Synchronize! for backup. It has a basic version ($30) and a pro version ($100), with options for a 2-year renewal covering updates and upgrades.
I prefer the pro version for several reasons, but the main one is the option to archive files from certain folders that have been deleted or overwritten by a backup session, to minimize human error. I believe both versions can warn if more than 50% (customizable) of the data is deleted in one session, but the Pro edition certainly has this option. And that would have prevented the earlier script accident: instead of overwriting all the data, a warning message pops up and you have to confirm manually before the backup proceeds.
$100 is for the vast majority (including me) still a serious amount of money, but compared to the loss of 35 GB of valuable data it really is peanuts…
Here is a link to compare both options:
http://www.qdea.com/synchronize_x_plus_compare.html
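Neither the 50% warning nor the archive-instead-of-delete idea is specific to Synchronize!, by the way. A minimal sketch of both safeguards might look like this (hypothetical paths and threshold; an approximation of the idea only, not the product's actual behaviour):

```python
import os
import shutil

SRC = r"C:\Photos"                # hypothetical source
DST = r"E:\Backup\Photos"         # hypothetical mirror
ARCHIVE = r"E:\Backup\_archive"   # deleted files are moved here, not erased
THRESHOLD = 0.5                   # refuse if over 50% of the mirror would go

def relative_files(root):
    """All file paths under root, relative to root."""
    found = set()
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            found.add(os.path.relpath(os.path.join(dirpath, name), root))
    return found

dst_files = relative_files(DST)
doomed = dst_files - relative_files(SRC)  # in the mirror, gone from the source

if dst_files and len(doomed) / len(dst_files) > THRESHOLD:
    raise SystemExit(f"Refusing: {len(doomed)} of {len(dst_files)} mirror "
                     "files would be removed - confirm this manually first.")

for rel in doomed:
    target = os.path.join(ARCHIVE, rel)
    os.makedirs(os.path.dirname(target), exist_ok=True)
    shutil.move(os.path.join(DST, rel), target)   # archive, don't delete
```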