I do not consent to content moderation of work that hasn't been deliberately uploaded by users to Adobe servers.
I am passionately against illegal, exploitative imagery...but just as passionately against an external body scanning work for such imagery. That kind of scanning should only be undertaken within a proper legal framework, in which case it would be fully justified.
We've all seen how limited these algorithmic scanners can be. Given the nature of artistic expression, not to mention issues around intellectual property and the way what counts as "offensive" shifts with social and political trends, there are too many ways for such a system to misjudge content, and false positives are inevitable. Not to mention Adobe's ability to data mine user content...something we currently have too few tools to oversee or limit.
Of course it's understood that Adobe should disallow various imagery on their servers, but because of the above-stated limitations of these scanning algorithms, they should not be moderating private content that was never intended to be sent to Adobe servers.
The simplest solution would be for users to have access to versions of software without content moderation, and of course without the benefits of generative/neural features.
Again, this argument isn't some gateway to producing exploitative imagery...it's about limiting how much external parties can misjudge, and gain access to, content in ways that can affect users professionally and legally.