Participant
August 26, 2022
Open for Voting

Nudity and other issues which appear to violate Adobe Generative AI Guidelines [merged thread]

  • August 26, 2022
  • 186 Replies
  • 121832 Views

Hello Adobe and its collective users

I am writing to you not only as a devoted user of Adobe’s suite of creative tools but also as a professional photographer whose work has been recognized and displayed in museum settings. My specialization in classic nudes has allowed me to explore the human form in a manner that celebrates beauty, form, and artistic expression. However, I have encountered a significant challenge with the AI restrictions placed on editing images that contain nudity, even when such images are created within a professional, artistic context.

 

As an artist whose work often involves nuanced and sensitive subjects, I understand and respect the complexities of creating ethical AI tools that serve a wide user base. However, the current limitations significantly impact my creative process and professional workflow, particularly when it comes to editing backgrounds for nude or semi-nude images. These restrictions not only prolong my work but also inhibit my artistic expression, compelling me to seek alternative solutions that may not offer the same level of quality and integration as Adobe’s products.

 

I propose the consideration of the following points, which I believe could benefit both Adobe and its professional users:

 

Artistic Integrity and Professional Use: Recognition of the professional and artistic context in which tools are used can help differentiate between content that is genuinely creative and that which the restrictions aim to prevent.

 

Ethical Use Policy: An ethical use policy that accommodates professional artists and photographers, possibly through a verification process, ensuring that our work is not unduly censored while maintaining legal and ethical standards.

 

Custom Solutions for Professionals: The development of specialized software versions that allow more flexibility for editing sensitive content, with appropriate safeguards to prevent misuse.

 

Feedback and Advisory Panel: Establishing a panel of professionals from the art and photography community to provide ongoing feedback and insights on how Adobe’s tools can better serve creative professionals.

 

Transparent Guidelines: The creation of clear, transparent guidelines that navigate the legal and ethical landscape, especially regarding sensitive content, to ensure users can understand and comply with Adobe’s policies.

 

I am fully committed to engaging in a constructive dialogue and am willing to be part of a solution that respects both the creative needs of artists and the ethical considerations of digital content. I believe that by working together, we can find a balanced approach that supports artistic expression while adhering to shared values and responsibilities.

 

Thank you for considering my perspective on this matter. I am hopeful for an opportunity to discuss this further and explore how we can make Adobe’s tools even more inclusive and accommodating for professional artists and photographers.    Steven Williams 

186 Replies

Participating Frequently
February 15, 2025

Considering it is Adobe AI, and Adobe is one of the industry's leaders, it shows how poorly they have trained the AI if it can't differentiate between red marks on a shirt and blood. Or they have the filters cranked up way beyond what they have listed in the TOS.

Inspiring
February 13, 2025

Indeed!

Inspiring
February 13, 2025

Well said.

 

D Fosse
Community Expert
February 13, 2025

Very good, @rayek.elfin 🙂 I'd sign off on that.

 

I normally don't get into these discussions, but now that I'm here, I'd like to specifically point out one thing: what kids do to each other on social media. That alone should make it obvious that you can't simply let this loose.

rayek.elfin
Legend
February 13, 2025

@DdavidChung

 

First, I actually am in favour of full freedom of information and art. If people here want to use genAI to create nudes in their art, then it is simple enough to install open source tools that are available for free and do not censor the output. Get a good Nvidia GPU, and you have full creative control!

 

Secondly, it is impossible to separate the tools from the actors and vice versa. It is well researched and documented that a tool will influence the actor's behaviour, thoughts, intentions, and interactions in the external world. It is an extremely naive worldview to think otherwise. Context is everything. And everything is connected, nothing is truly separable.

 

What is happening in the world right now only proves that the average human being still needs supervision. Earlier in this thread someone mentioned that only children need to be watched over. Obviously many adult humans require the constant threat and supervision of the law and society's policing to behave "nicely". And unfortunately this works on both micro and macro levels: give a country nuclear weapons and its behaviour changes dramatically on the world stage.

 

If it were true that human adults require no supervision we wouldn't still have wars, strife, and untold human and non-human suffering all around our small and fragile planet. We would work together instead and nurture every single human life and non-human life. This, unfortunately, is not the case. 

 

So while I hope for a better future in which humans can be allowed full freedom, reality tells us otherwise at this point in time. Controls and supervision need to be in place. GenAI's threshold for destructive purposes is too low. The proverbial monkey could generate "art" with it.

 

Now, I do agree that it would be possible for Adobe to put firewalls and paywalls up for those who want to use its genAI tools to create nude art. It is however a cost-benefit calculation, and it seems Adobe management decided against potential litigation and a large extra financial overhead to maintain these.

 

Anyway, the tools are out there to do what you @DdavidChung want to do. Redirect your attention elsewhere instead of butting heads with a corporate wall that will not and cannot budge (for good reasons).

Participant
February 13, 2025

@rayek.elfin 

No, your argument is blurring the line between tools and actors, deliberately attributing a person's crime to the tool.

Unless a tool is totally, specifically designed for committing crimes, the responsibility for a crime always lies with the person who misuses it, not the tool itself.

Yes, some tools are indeed more prone to misuse. In such cases, the proper approach should be implementing more thorough user qualification reviews, rather than restricting all users to a limited version.

Regarding the claim that creating nude art is illegal in certain countries, the proper approach should be nationality verification - implementing restrictions only for users from countries where such content is illegal. The solution should not be imposing a global restriction just because it is illegal in one country. Otherwise, basic tools like Photoshop or even a painter’s brush should also be restricted, which would be an unreasonable double standard.

You don’t need to defend Adobe. Based on your argument, the best approach for Adobe should be to establish different user tiers, for example:

  • Limited - For countries that do not allow nudity or similar content.
  • General - Allows most uses, except some content with age limitations or obviously illegal content.
  • Advanced - Requires verification of age and nationality, and allows the generation of any content except obviously illegal content, under stricter usage commitments (to legally exempt Adobe from liability).
  • Pro - For users with artistic technical skills, under the strictest usage commitments, who can generate almost anything.
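The tier scheme proposed above can be sketched as a simple policy lookup. This is a minimal illustration of the idea only, not Adobe's actual policy: the tier names follow the list above, but the rule flags, the `User` fields, and the `may_generate` function are all invented for the example.

```python
# Hypothetical sketch of the tiered-access proposal above.
# Tier rules and verification requirements are illustrative assumptions.
from dataclasses import dataclass

TIER_RULES = {
    "limited":  {"nudity": False, "age_restricted": False},
    "general":  {"nudity": False, "age_restricted": True},
    "advanced": {"nudity": True,  "age_restricted": True},
    "pro":      {"nudity": True,  "age_restricted": True},
}

@dataclass
class User:
    tier: str
    age_verified: bool = False
    nationality_verified: bool = False

def may_generate(user: User, content_flags: set) -> bool:
    """Return True if the user's tier permits content with these flags."""
    rules = TIER_RULES[user.tier]
    if "nudity" in content_flags and not rules["nudity"]:
        return False
    if "age_restricted" in content_flags and not rules["age_restricted"]:
        return False
    # Advanced/Pro tiers only apply after identity verification.
    if user.tier in ("advanced", "pro") and not (
        user.age_verified and user.nationality_verified
    ):
        return False
    return True
```

Under this sketch, a "limited" user is refused any flagged content, while a verified "advanced" user is not, which is exactly the context-sensitivity the post argues a blanket filter throws away.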

It is obvious that Adobe has taken the lazy route by making everyone use the limited version with moderation that has a high misjudgment rate, without considering age, nationality, or artistic technical skill. Adobe is the one who doesn't consider the context.

Of course, this isn't called a mistake - it's Adobe's business decision.

Dario de Judicibus
Known Participant
February 13, 2025
I fully support and wholeheartedly agree with this: Limited, General, Advanced, Pro.

Dario de Judicibus
rayek.elfin
Legend
February 13, 2025
quote

Making counterfeit money is illegal.

 

Making nude art is not.

 

... Using nudes to extort, bully, emotionally intimidate, abuse others is, however.

 

GenAI is very, VERY good at that.

 

So using money in art is legal. Making counterfeit money is illegal.

 

Making nude art is legal in some countries, and illegal in other countries. Using nude art to extort, etc. others is illegal.

 

See? Not so simple. Nothing is ever black-and-white. It depends on the context.

 

If I were Adobe I'd think twice before allowing full "creative freedom" and enabling ANYONE --without any artistic technical skills-- to generate photo-realistic offensive and illegal imagery within seconds anywhere in the world.

 

It's all about context.

Participating Frequently
February 12, 2025

Making counterfeit money is illegal. 

 

Making nude art is not.

 

End of discussion.

 

Unless something has changed, in which case a lot of art galleries and museums are going to have to get rid of a whole lot of their art.

 

Perhaps they can replace it all with pictures of puppies and kittens since that’s all Photoshop appears to be any good for these days.

Participating Frequently
February 12, 2025

And there’s the problem. Adobe isn’t concerned about people making offensive images; they are concerned that their poorly programmed, loose-cannon software will create a pornographic image when a kid is trying to change a picture of a bug.

 

So their solution seems to be a second nanny-type program that reviews the images produced before showing them to the creator, and blocks them just in case - even when the request wasn’t for anything pornographic. This is why it gets so far into the generation process before saying no.

 

That is not the solution.

 

If more accurate programming isn’t possible at this time, the most logical solution is to allow user control of that nanny program, along with a disclaimer that by turning it off the user understands that it’s a new type of software that still has some control issues, and that random, unexpected, potentially very offensive results are possible.

 

Alternatively, if the generation software can’t be improved, then improve the nanny filter so that it can be set to different levels of acceptability (like Google image search):

  • “Safe” - doesn’t show flagged images at all (as it currently does for everyone).
  • “Moderate” - gives a warning of violence or nudity that requires 18+ approval before showing, or blurs out anything potentially offensive with approval required to unblur.
  • “Low” - shows everything except perhaps anything that is truly illegal.

 

This would be very easy to do properly. It exists for just about every image search program out there.
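The SafeSearch-style levels described above amount to a small lookup table mapping a user's filter setting and a detected content category to an action. The setting names follow the post; the category names and action strings are invented for this sketch and do not correspond to any real Adobe API.

```python
# Illustrative sketch of the proposed filter-severity levels.
# Categories and action names are assumptions for the example.
FILTER_ACTIONS = {
    "safe":     {"nudity": "block",             "violence": "block"},
    "moderate": {"nudity": "blur_with_consent", "violence": "warn_18_plus"},
    "low":      {"nudity": "show",              "violence": "show"},
}

def action_for(setting: str, category: str) -> str:
    """Look up what the filter should do with a generated image.

    Unrecognized categories fall through to "show", mirroring how
    image-search filters only act on content they actually flag.
    """
    return FILTER_ACTIONS[setting].get(category, "show")
```

As the post notes, this pattern already exists in essentially every image search engine, which is why the poster considers it straightforward to implement.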

 

Come on Adobe, you represented a giant leap forward in art and art production when you came on the scene decades ago (I was around for it), now you represent a giant leap backward.

 

You're better than this. If it's a software reliability problem, let people know; they'll understand. Don't hide behind this censorship [cursing removed].

daniellei4510
Community Expert
February 11, 2025

I had no problems with it myself when attempting gen expand. Well... yes, the first two tries resulted in a violation notice. Then I moved the image to the left about a quarter of an inch and it worked. Then I just attempted it again, as is, without moving it, and it worked. In this case, at least, it's a bug, not a censorship issue. P.S.: the results were horrid. Weird lamps, a toilet that could rival Dali, etc.

Adobe Community Expert | If you can't fix it, hide it; if you can't hide it, delete it.