
Organizer database JSON files

New Here, Sep 16, 2020


Hi all,

 

I am new to the support community; however, I have been using Elements since version 6. I am now on the latest version (18) on Windows.

 

I have a fairly large database of around 90K pictures that I have built up slowly over the years. I have been using face recognition since the beginning and, in general, I am fairly happy with the results. Over the years I have spent many hours/days/weeks cleaning it up and tagging everything. I am now at a point where the number of JSON files in the database is becoming overwhelming. I like to copy the catalog to a USB stick when I travel, for laptop use, but the time it takes to copy 90K files is just way too long. Same thing when I do the catalog backup on my NAS. Even zipping it takes forever. It is becoming really annoying, and it is not going to improve as more pictures pour in. Elements is installed on a new Intel 10th Gen 6-core computer with plenty of memory.

 

As a workaround, I decided to split my catalog into smaller chunks of about 10K pictures each. So I removed about 80K pictures from the catalog to make the first one and cleaned up the tags, albums and the rest. All is fine as far as the Organizer goes. But when I go back to see how many JSON files remain in the folder, nothing has changed: still 90K JSON files! It seems that when pictures are removed from the catalog, the JSON folder is not cleaned up. The JSON files themselves might be cleaned out (many only have {} in them) but they are not deleted. I have not found any way to rebuild the catalog while retaining the face recognition info, so my efforts to work around the problem were in vain.

 

I understand that I could get rid of it all and start from scratch, but I would like to retain the face analysis work that I have already done, since I spent many, many hours cleaning it up. Any suggestions?

 

What would be the impact of deleting all those empty JSON files? I have 35,000 of them.

 

Thanks in advance

 

Pierre

TOPICS: Feature request, Organizer

Community Expert, Sep 16, 2020


What would be the impact of deleting all those empty JSON files? I have 35,000 of them.

I agree with all of your observations. I don't know what impact deleting the empty .json files would have, but maybe you could try it with a sample set. As you are probably aware, the .json files are created at the time an image is recognized. Using this information, you may be able to match up the empty files with the approximate dates that the images were imported. Use the Sort By: Import Batch view to assist.
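
If you want to experiment along those lines, here is a rough PowerShell sketch (not an official Adobe tool; the catalog path is only an example, so point it at your own catalog folder). It lists the .json files that contain nothing but {} together with their dates, so you can line them up with your import batches before deciding whether to touch anything:

# Rough sketch, example path only: list .json files whose content is just "{}"
# along with their last-modified dates, for matching against import batches.
$catalog = "C:\ProgramData\Adobe\Elements Organizer\Catalogs\My Catalog"

Get-ChildItem -Path $catalog -Filter *.json -Recurse |
    Where-Object { (Get-Content $_.FullName -Raw).Trim() -eq '{}' } |
    Sort-Object LastWriteTime |
    Select-Object LastWriteTime, Name |
    Format-Table -AutoSize

Run it on a copy of the catalog folder first; it only reads and lists files, but it is safer to experiment on a sample set as mentioned above.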

My only other suggestion would be to create a keyword hierarchy to match the People hierarchy.  It is a fairly easy job to do this as follows:

  1. Create a new keyword for each person in the People category.
  2. Select a People tag to filter the grid by that person.
  3. Select all images in the grid (Ctrl+A).
  4. Tag the images with the matching keyword tag.
  5. If desired, delete the People tag and/or the .json files.

You will no longer be able to use face recognition on the keyword-tagged files, but you can continue to use it for new images imported into the catalog. You can then easily find and keyword-tag the people in those new images, using the People tags that face recognition assigns to them.

 

New Here, Sep 17, 2020


Hi,

Thank you for your reply.

 

I did try deleting the empty JSON files. I did not try to work out which ones belonged to which pictures; I just deleted all the empty ones. I could not identify any adverse effect in the Organizer. That being said, without more information on the JSON process implemented in the Organizer, it is too big a risk. I do not want to be six months down the road with a problem related to the deletion of the empty JSON files. I have too much to lose... In an attempt to get some answers, I sent a question to support. I do not have much hope, but in any case, I will share the answer here.

 

I did think about the keyword-tag approach you suggested. It is a workaround, but I will do that only as a last resort, after I look around for alternatives. It's too bad: Adobe spent a lot of time, effort and money to implement and constantly improve face recognition, to the point where it is very good, only to partly nullify it by using a JSON database. So many people are simply not using it.

 

It's a shame, I really don't understand it...

 

Thanks again

Participant, Sep 16, 2020


I can't exactly help you with the delete question, other than to suggest, like the previous answer, trying it on a test database.

Regarding the remark "even zipping takes forever", I have a tip.
I once tinkered a bit with PowerShell on my Windows machine, combined with the 7-Zip command line (the built-in archiving in PowerShell is slow).
You can script (or manually use 7-Zip) to update a zip file.
So, create a zip file of your database folder (this takes some time) and after that, use 7-Zip to update the zip file with only the changed files. It finishes in minutes; smaller databases are even faster.

And copying one large zip file is no problem. Windows just can't handle many small files (I really, really do not understand why on earth Adobe still uses all those small files and not just one database for this).
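
To give an idea of what I mean, here is a rough PowerShell sketch of the approach. The paths are only examples (use your own catalog and backup locations), and it assumes 7-Zip is installed in its default location:

# Rough sketch of the create-once / update-afterwards 7-Zip approach; paths are examples only.
$sevenZip = "C:\Program Files\7-Zip\7z.exe"                                    # default 7-Zip install path
$catalog  = "C:\ProgramData\Adobe\Elements Organizer\Catalogs\My Catalog"      # example catalog folder
$archive  = "D:\Backups\ElementsCatalog.zip"                                   # example backup target

if (Test-Path $archive) {
    & $sevenZip u $archive "$catalog\*" -r    # update: only changed or new files are re-added
} else {
    & $sevenZip a $archive "$catalog\*" -r    # first run: create the archive (this is the slow one)
}

Only the first run is slow; every run after that just pushes the changed files into the existing zip.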

New Here, Sep 17, 2020


Hi,

 

Thanks for the 7-Zip trick. I will give it a try. But 90,000 files is beyond reasonable.

Participant, Sep 17, 2020


Yeah, I know. Let me give you some numbers.

My largest database (I use 5 databases) holds 100,198 photos. The database folder is 6.3 GB in size and contains a bit over 215,000 files (107K .json and 109K .xmp files).

Needless to say, Windows TOTALLY does not like that.

I just ran the PowerShell script to update the 7-Zip archive; for this largest database the update took 3.3 minutes, and the .zip file is currently 4 GB. And this 4 GB data file is just one file and can be backed up quite fast.

And yes, the first time it takes a lot longer to create the zip file. But I think it's totally worth it.
