I had a devil of a time getting my catalog to update after changing my plug-in's metadata format.
I was just about at the end of my rope when I finally got lucky:
Trick: DO NOT change the version number on any metadata items being altered (at least not for enum-type items whose enum values are changing), otherwise the update "errors out" with a flash before the update function ever gets called.
This is the opposite of what I expected: I assumed I would have to change the item version numbers.
Does anybody know under what conditions the version number must be changed, under what conditions it must not be changed, and under what conditions changing it is optional?
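For reference, here's the kind of thing I'm talking about - a hypothetical enum field in a metadata-definition file (the field id, title, and values are made up, but the keys are the standard ones):

    -- Hypothetical field definition; the 'version' key is the item version
    -- number in question. Per the trick above, leave it alone even though
    -- the enum values change (when you're handling the update yourself).
    metadataFieldsForPhotos = {
        {
            id = 'reviewed',        -- made-up field id
            title = 'Reviewed',
            dataType = 'enum',
            values = {
                { value = 'yes', title = 'Yes' },
                { value = 'no',  title = 'No'  },
            },
            version = 1,            -- the item version number: NOT bumped
        },
    },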
Rob
Last summer I read through the relevant section of the SDK Guide a few times, and I must admit I didn't fully grok what I would have to do with schema version numbers and updates. So I've kept my head in the sand -- any clarity we could glean from others would be great.
At the risk of being a blind man attempting to lead another blind man, a few observations (recommend taking with a grain of salt):
You can't lower a schema version number - you get one shot at the upgrade function working; after that you have to try the next schema version up.
So my development scenario was:
1. Keep bumping the schema version until the update function works - restoring a backup catalog if necessary to fall back and retest.
My sense, although unconfirmed, is that bumping an item version number actually creates an entirely new entry in the metadata database table. In any case, you can fall back to a previous version provided the metadata definition is compatible (not sure exactly what that means).
I would guess the database id = name + version-num, but now I'm really out on a limb - maybe someone will pull out an SQLite client and investigate?
My thinking is: if you bump an item version number, Lightroom will see it and do an auto-conversion, assuming that's not being disallowed - even if the schema version is not bumped. However, if auto-conversion is being denied, the new database entry is not created (or maybe it is; in any case the metadata status goes bad, and the conversion function is not called). Thus, you must bump the item version numbers for auto-conversion's sake (if anything in the item definition would be incompatible with the present database entry), but you must not bump them if you will be handling conversion "manually".
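To make that concrete, here's what I mean by bumping an item version number (hypothetical field definition, untested - just illustrating the mechanism):

    -- Before (as originally released):
    { id = 'status', title = 'Status', dataType = 'enum', version = 1,
      values = { { value = 'true', title = 'True' }, { value = 'false', title = 'False' } } },
    -- After (enum values changed, version bumped so Lightroom notices and
    -- attempts auto-conversion - if my theory above is right):
    { id = 'status', title = 'Status', dataType = 'enum', version = 2,
      values = { { value = 'yes', title = 'Yes' }, { value = 'no', title = 'No' } } },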
Not sure this makes total sense, but I think it would sort of explain what I remember happening...
Not sure how accurate any of this is, but in any case it illustrates that there are some questions that need answers before anyone can master this aspect of plug-in writing - certainly the guide is sorely inadequate in this regard, IMO.
R
Maybe the schema version bump actually triggers an entirely new set of database tables, and hence if you also change the item version numbers it becomes impossible to find the old items in the previous schema(?) - when doing a "manual" update.
Maybe this is the deal:
- Update item version numbers instead of the schema version, and Lightroom will auto-convert items if possible; if not possible, Lightroom flashes an unreadable warning and then fails. Hint: set noAutoUpdate to false/nil. Or:
- Update the schema version and leave item version numbers as-is to create a new dataset with manual transfer. Hint: set noAutoUpdate to true.
Or something like that.
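If that's right, the two alternatives would look something like this in the metadata-definition file (my reading only, not gospel - details are guesses):

    -- Alternative 1: auto-conversion. Bump the item 'version' numbers in
    -- metadataFieldsForPhotos, leave the schema version alone, and let
    -- Lightroom try to convert stored values itself:
    schemaVersion = 1,
    noAutoUpdate = false, -- or just omit it

    -- Alternative 2: manual transfer. Bump the schema version, leave item
    -- versions alone, and migrate values yourself:
    schemaVersion = 2,
    noAutoUpdate = true,
    updateFromEarlierSchemaVersion = function( catalog, previousSchemaVersion, progressScope )
        -- read old values and write new ones here
    end,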
R
Excerpt from DevMeta - working:
    -- Schema version history:
    --   1    First released schema.
    --   2-6  Test schemas: never released.
    --   7    Second released schema: changed 'true'/'false' enums to 'yes'/'no'
    --        to work around a bug in enum metadata display.
    schemaVersion = 7,

    -- The manual update function will forevermore be called instead of auto-updating.
    noAutoUpdate = true,

    -- When the plug-in is first installed, previousSchemaVersion is nil.
    -- This function is pre-wrapped by catalog:withPrivateWriteAccessDo.
    updateFromEarlierSchemaVersion = function( catalog, previousSchemaVersion, progressScope )
        -- Not sure if this check is necessary, but I don't think it will hurt:
        if previousSchemaVersion == nil then
            return
        end
        if previousSchemaVersion < 7 then -- generally it will be 1 (or was it 2?) for all users except me.
            local pluginId = _PLUGIN.id -- 'com.robcole.lightroom.metadata.DevMeta'
            local photosToMigrate = catalog:getAllPhotos()
            local total = #photosToMigrate
            -- Translate one field of one photo from the old enum values to the new ones.
            local function update( photo, name )
                local oldValue = photo:getPropertyForPlugin( pluginId, name )
                local newValue
                if oldValue == 'true' then
                    newValue = 'yes'
                elseif oldValue == 'false' then
                    newValue = 'no'
                end
                if newValue then
                    photo:setPropertyForPlugin( _PLUGIN, name, newValue )
                end
            end
            for i, photo in ipairs( photosToMigrate ) do
                update( photo, 'rgbColorMods' )
                update( photo, 'toneCurveParamMods' )
                update( photo, 'toneCurvePointMods' )
                update( photo, 'colorMods' )
                update( photo, 'splitToning' )
                update( photo, 'perspective' )
                update( photo, 'postCropVignette' )
                update( photo, 'grain' )
                update( photo, 'cropped' )
                update( photo, 'retouched' )
                update( photo, 'redeye' )
                update( photo, 'gradients' )
                update( photo, 'brushes' )
                progressScope:setPortionComplete( i, total )
            end
        elseif previousSchemaVersion == 7 then
            -- Must be a metadata item version trigger: nothing to convert.
        else
            error( "can not convert metadata from schema version " .. ( previousSchemaVersion or 'nil' ) )
        end
    end,
PS - This function takes a really long time to run, but seems to get the job done.
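If the slow part is the one-photo-at-a-time reads, it might go faster with the catalog's batched read. Untested sketch; note that the exact return shape of batchGetPropertyForPlugin is my assumption here, so check the SDK guide before trusting this:

    -- Untested sketch: read all old values in one batched call, then write
    -- only what needs changing. Assumes catalog:batchGetPropertyForPlugin(
    -- photos, pluginId, fieldNames ) returns a table keyed by photo, each
    -- entry mapping field name to value.
    local fieldNames = {
        'rgbColorMods', 'toneCurveParamMods', 'toneCurvePointMods', 'colorMods',
        'splitToning', 'perspective', 'postCropVignette', 'grain', 'cropped',
        'retouched', 'redeye', 'gradients', 'brushes',
    }
    local allValues = catalog:batchGetPropertyForPlugin( photosToMigrate, pluginId, fieldNames )
    for i, photo in ipairs( photosToMigrate ) do
        local props = allValues[ photo ]
        if props then
            for _, name in ipairs( fieldNames ) do
                if props[ name ] == 'true' then
                    photo:setPropertyForPlugin( _PLUGIN, name, 'yes' )
                elseif props[ name ] == 'false' then
                    photo:setPropertyForPlugin( _PLUGIN, name, 'no' )
                end
            end
        end
        progressScope:setPortionComplete( i, total )
    end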
As best I can tell, noAutoUpdate does not do anything at all. My testing results:
If you have an update function, it will be called whenever either the schema version changes or an item version changes, regardless of the setting of noAutoUpdate.
And similarly, if you don't have an update function defined, auto-update will be attempted regardless of the setting of noAutoUpdate.
I don't know if noAutoUpdate is truly a "write-only" variable, or whether it's doing something my tests have not been able to detect.
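If anyone wants to reproduce the test, something like this hypothetical probe should do - a do-nothing update function that just logs when it's called:

    -- Probe sketch: bump the schema version or an item version, toggle
    -- noAutoUpdate, reload the plug-in, and watch whether the function fires.
    noAutoUpdate = true, -- flip between true and nil between runs
    updateFromEarlierSchemaVersion = function( catalog, previousSchemaVersion, progressScope )
        local LrLogger = import 'LrLogger'
        local log = LrLogger( 'MetaVersionProbe' )
        log:enable( 'logfile' )
        log:trace( 'update called; previous schema version: ' .. tostring( previousSchemaVersion ) )
    end,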