
Nudity and semi-nudity using AI, and the restrictions imposed on it.

Community Beginner, Feb 26, 2024

Hello Adobe and its collective users

I am writing to you not only as a devoted user of Adobe’s suite of creative tools but also as a professional photographer whose work has been recognized and displayed in museum settings. My specialization in classic nudes has allowed me to explore the human form in a manner that celebrates beauty, form, and artistic expression. However, I have encountered a significant challenge with the AI restrictions placed on editing images that contain nudity, even when such images are created within a professional, artistic context.

 

As an artist whose work often involves nuanced and sensitive subjects, I understand and respect the complexities of creating ethical AI tools that serve a wide user base. However, the current limitations significantly impact my creative process and professional workflow, particularly when it comes to editing backgrounds for nude or semi-nude images. These restrictions not only prolong my work but also inhibit my artistic expression, compelling me to seek alternative solutions that may not offer the same level of quality and integration as Adobe’s products.

 

I propose the consideration of the following points, which I believe could benefit both Adobe and its professional users:

 

Artistic Integrity and Professional Use: Recognition of the professional and artistic context in which tools are used can help differentiate between content that is genuinely creative and that which the restrictions aim to prevent.

 

Ethical Use Policy: An ethical use policy that accommodates professional artists and photographers, possibly through a verification process, ensuring that our work is not unduly censored while maintaining legal and ethical standards.

 

Custom Solutions for Professionals: The development of specialized software versions that allow more flexibility for editing sensitive content, with appropriate safeguards to prevent misuse.

 

Feedback and Advisory Panel: Establishing a panel of professionals from the art and photography community to provide ongoing feedback and insights on how Adobe’s tools can better serve creative professionals.

 

Transparent Guidelines: The creation of clear, transparent guidelines that navigate the legal and ethical landscape, especially regarding sensitive content, to ensure users can understand and comply with Adobe’s policies.

 

I am fully committed to engaging in a constructive dialogue and am willing to be part of a solution that respects both the creative needs of artists and the ethical considerations of digital content. I believe that by working together, we can find a balanced approach that supports artistic expression while adhering to shared values and responsibilities.

 

Thank you for considering my perspective on this matter. I am hopeful for an opportunity to discuss this further and explore how we can make Adobe’s tools even more inclusive and accommodating for professional artists and photographers.

Steven Williams

2 Correct answers

Community Expert, Jun 12, 2024

@Dalvidos Similar requests have been made and each time users are referred back to the terms of use outlined by Adobe.

https://www.adobe.com/legal/licenses-terms/adobe-gen-ai-user-guidelines.html

 

Community Expert, Jun 04, 2024

Adobe is widely used in educational and business settings. They've made a choice to prevent misuse/abuse and to train their models on licensed content to limit liability.

If you are working with nudity, there are ways around the existing models in Photoshop (a scripted version of the layer steps is sketched after the list):

  1. Duplicate the layer. Hide the original Layer.
  2. Paint over the "offensive" areas covering up any triggered items. 
  3. Select and generate.
  4. Turn off the painted layer once you have your generation.
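If you do this often, the layer shuffling in steps 1 and 4 can be scripted. Here is a minimal sketch in Photoshop's ExtendScript (plain JavaScript, runnable from the File > Scripts menu), assuming a document is open and the layer you want to shield is the active layer; the layer name is only illustrative, and the painting, the selection, and the Generative Fill run itself are still done by hand.

```javascript
// Minimal ExtendScript sketch of steps 1 and 4 above.
// Assumes a document is open in Photoshop and the layer to shield is active.
var doc = app.activeDocument;
var original = doc.activeLayer;

// Step 1: duplicate the layer and hide the original.
var paintedCopy = original.duplicate(); // working copy to paint over
paintedCopy.name = "painted cover";     // illustrative name
original.visible = false;

// Steps 2 and 3 are manual: paint over the triggering areas on "painted cover",
// make your selection, then run Generative Fill as usual.

// Step 4, once the generation layer exists: hide the painted copy, restore the original.
// paintedCopy.visible = false;
// original.visible = true;
```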

If you are trying to generate nudity - you're better off looking

...

274 Replies
Explorer, Feb 07, 2025

This is the best response to Adobe's overreaching censorship I have read. Thank you, sir.

Participant, Feb 08, 2025

[Attached image: IMG_0624_Q4H1zo.jpg]

 

Photoshop informed me this file is too explicit and violates their moral and ethical guidelines, so Generative AI cannot be used to expand the background and let me crop it slightly wider. Am I a deviant? That's essentially what Adobe is saying.

I realize that American corporations are being ordered to excise all women's rights and to adopt a more "Judeo-Christian" focus. Do we know if Adobe is part of this anti-inclusivity movement to crush the rights of minorities and perhaps only allow images of men of specific ethnicities to be edited with their products?

Community Expert, Feb 08, 2025

@QINGCHARLES The stains on the shirt probably look like blood to an artificial intelligence.

Advocate, Feb 08, 2025

"You can see as many nudes in art as you want, both male and female, just by
walking around the squares of Italy." - mostly males ... :O)

Community Expert, Feb 08, 2025

"In Europe ... We do not have the same aversion to nudity that exists in the USA. Here, we don't even have a law preventing work with images of banknotes."

 

Well, that is only partly true. If at all.

 

Just as in the USA, we have huge problems with bullying and with images of nude bodies being used for that purpose.

And of course there are laws against forgery, I mean, come on ...

 

Please do not deny the obvious.

Community Beginner, Feb 08, 2025

"I get around it in Photoshop itself. It is fairly easily tricked. As long as it can't see a feminine-looking person in the photo, it'll let you do background removal, canvas extend etc."

The problem is that what Photoshop does, and what AI was supposed to do, is speed up workflow. There are ways to get around it. It will not let you use "Satan" (but it will let you use Jesus and Buddha, which to me seems like religious censorship as well), but you can tinker around for 30 minutes experimenting with prompts and make a red-skinned demon with horns, a tail, and cloven hooves carrying a pitchfork.

The problem is three-fold. First, it is not speeding up my workflow, it is slowing it down. Second, there are countless examples of what Americans legally and philosophically consider protected speech, and art is a form of speech, where "Satan" is a desirable image. Political cartoons regularly use the image to make commentary on things perceived as "evil", whether they are saying the thing is being, literally, demonized, or that the thing really is very bad. Cartoonists often use Photoshop, but the tools they can use now are limited, and that limits political art and illustrated commentary on everything from drug use and incarceration to violence, war, school shootings, and nuclear proliferation.

What they are doing is not going to solve or reduce the problems they are targeting. Pornographers are not going to make less pornography and people exploiting children are not going to exploit fewer children. At the same time, if Adobe cut out the censorship it would not increase the exploitation of children, violence, or pornography. There are too many alternatives out there that are much cheaper and designed for what they want. Child pornography creators are not banging their heads against the wall saying "Adobe won't let me do this and it is completely destroying my workflow." There are tons of legitimate artists that are saying exactly that however.

Even from a moral standpoint, preventing people from creating art and imagery that informs the public about horrible things, and that helps translate those horrible things into a language people across the world can relate to, is at least as harmful, and likely more so, as not preventing your product from being used to create images of horrible things, or horrible things themselves (like child pornography).

There is also the problem that people in many countries, the US (where Adobe is based) primary among them, consider the restriction of speech and expression to be much, much more offensive than virtually everything being banned, with the exception of child pornography. There is a much greater backlash from people trying to prevent the censoring of nude images than from people trying to ban nude images, yet Adobe is pandering to a minority of largely religious fanatics who view any depiction of the human form as a sin while completely ignoring the majority that prioritizes freedom of expression. When you look at the actual users of Adobe products, the number of people offended by things such as nudity drops to almost nothing, while the number of people offended by censorship vastly increases. In trying to be politically correct and inoffensive, Adobe is directly offending a large proportion of its users on both political and moral grounds. Telling an artist what is and is not OK to create is often as offensive to them as an image of Muhammad is to most Muslims, and certainly more offensive than an image of Satan is to the vast majority of Christians. Christians themselves create art featuring Satan.

This whole thing is just misguided politically correct insanity that cannot address the problems they are trying to limit while creating serious problems for their actual customers. It just proves that Adobe is not run by people who relate to its users; it is run by businessmen and lawyers who are completely out of touch with their customers. In the end, this is only going to hurt them. I personally feel dirty using a tool that promotes artistic censorship, and I am seriously looking into what alternatives I can use that are not morally repugnant to me.

Community Beginner, Feb 08, 2025

"it seems you might be confusing »AI-based image generation« and »art"

And it seems that you are confusing your definition of art with free speech and expression. First, not every use of AI image generation is to generate a final result. I often use it to generate specific elements to be used in something larger. This might be an element in a collage, an element in a larger image, or, most usually, a replacement element for a photo, whether it is an eye, a part of the background, or the pendant on a necklace. If there are a bunch of distracting trucks in the background of a portrait that you would otherwise consider "art," would using AI to get rid of them instead of the Clone Stamp reduce the "art" status of the portrait?

Adobe will analyze my image to determine if it is "offensive" before allowing me to generate things for the image. This is bad in itself, but even in practice it just plain gets wrong what it is looking at. There are countless cases where people in bathing suits, especially flesh-colored bathing suits, are tagged as nudity and Adobe AI tools are not allowed to be used on them.

The Adobe censorship issue goes way beyond creating images from scratch and using them as your final product. It will not let actual artists who take pictures of nudes use AI to expand their backgrounds. I could not use AI tools in a pro-breastfeeding pamphlet. If I want to make an anti-violence work, I can be restricted not only by the content of the image, but I also cannot use the prompt "gun" or "blood", because Adobe assumes that any use of guns or blood in images is going to promote or glorify violence or self-harm.

As far as "art" goes, not all speech and communication needs to be art. I cannot use Adobe to generate blood for an informational PDF outlining a surgical procedure. In instances such as this it does not matter if the imagery is original; what matters is the information being conveyed. Much of graphic design is not intended to be art, it is intended to convey a message, and graphic designers are being dissuaded from using Adobe AI in their projects.

Everyone is worried about AI taking away jobs from graphic designers, but Adobe is actively limiting the tools graphic designers can use to keep up with AI.

They are just wrong on this issue on every level except possibly intent. They do not want to get sued and they do not want to be associated with illegal or "offensive" images. Finding a way to limit illegal images is understandable, but "offensive" is subjective and directly conflicts with the morality of the vast majority of its users. Art is often going to be offensive, speech is often going to be offensive, and even accurate information is often going to be offensive. If Adobe wants to make sure that it is never associated with anything offensive, then it needs to get out of the art, imaging, and graphic design business and go into something that is not, by its nature, going to sometimes be offensive... like rescuing puppies.

Community Beginner, Feb 09, 2025

Exactly!!!

Community Expert, Feb 10, 2025

I had no problems with it myself when attempting gen expand. Well... yes, the first two tries resulted in a violation notice. Then I moved the image to the left about a quarter of an inch and it worked. Then I just attempted it again, as is, without moving it, and it worked. In this case, at least, it's a bug, not a censorship issue. P.S.: the results were horrid. Weird lamps, a toilet that could rival Dali, etc.


daniellei4510 | Community Forum Volunteer
---------------------------------------------------------
I am my cat's emotional support animal.

Community Beginner, Feb 12, 2025

And there’s the problem. Adobe isn’t concerned about people making offensive images; they are concerned that their poorly programmed, loose-cannon software will create a pornographic image when a kid is trying to change a picture of a bug.

 

So their solution seems to be a second, nanny-type program that reviews the images produced before showing them to the creator and stops them from being shown, just in case the result turns out pornographic even though the request wasn’t for anything pornographic. This is why it gets so far into the generation process before saying no.

 

That is not the solution.

 

If more accurate programming isn’t possible at this time, the most logical solution is to give the user control of that nanny program, along with a disclaimer that, by turning it off, the user understands that it’s a new type of software that still has some control issues and that random, unexpected, potentially very offensive results are possible.

 

Alternatively, if the generation software can’t be improved, then improve the nanny filter so that it can be set to different levels of acceptability (like Google image search): on the "safe" setting it doesn’t show flagged images at all (as it currently does for everyone); on the "moderate" setting it gives a warning of violence or nudity that requires 18+ approval before showing, or blurs out anything potentially offensive with approval required to unblur; and on the "low" setting it shows everything except perhaps anything that is truly illegal.

 

This would be very easy to do properly. It exists for just about every image search program out there.
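For what it's worth, the behaviour described above is simple enough to express in a few lines. This is a purely hypothetical JavaScript sketch of what a user-selectable filter level could do with a generated result; the flag names, level names, and actions are invented for illustration and have nothing to do with Adobe's actual implementation.

```javascript
// Hypothetical sketch of a user-selectable output filter, mirroring the
// "safe" / "moderate" / "low" levels described above. Not Adobe's API.
function handleGeneratedImage(flags, level) {
  if (flags.illegal) {
    // Truly illegal material is never shown, regardless of the level.
    return { action: "block", reason: "illegal content" };
  }
  if (!flags.nudity && !flags.violence) {
    return { action: "show" }; // nothing potentially offensive was detected
  }
  switch (level) {
    case "safe":     // current behaviour for everyone: flagged results are withheld
      return { action: "block", reason: "potentially offensive" };
    case "moderate": // warn and blur, require 18+ confirmation to reveal
      return { action: "blur", requires: "18+ confirmation" };
    case "low":      // show everything that is not outright illegal
      return { action: "show" };
    default:
      return { action: "block", reason: "unknown filter level" };
  }
}

// Example: a result flagged for nudity on the "moderate" setting is blurred, not blocked.
console.log(handleGeneratedImage({ nudity: true, violence: false, illegal: false }, "moderate"));
```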

 

Come on, Adobe: you represented a giant leap forward in art and art production when you came on the scene decades ago (I was around for it); now you represent a giant leap backward.

 

You're better than this. If it's a software reliability problem, let people know; they'll understand. Don't hide behind this censorship [cursing removed].

Community Beginner, Feb 12, 2025

Making counterfeit money is illegal. 

 

Making nude art is not.

 

End of discussion.

 

Unless something has changed, in which case a lot of art galleries and museums are going to have to get rid of a whole lot of their art.

 

Perhaps they can replace it all with pictures of puppies and kittens since that’s all Photoshop appears to be any good for these days.

Mentor, Feb 12, 2025

quote

Making counterfeit money is illegal.

 

Making nude art is not.

 

... Using nudes to extort, bully, emotionally intimidate, or abuse others is, however.

 

GenAI is very, VERY good at that.

 

So using money in art is legal. Making counterfeit money is illegal.

 

Making nude art is legal in some countries, and illegal in other countries. Using nude art to extort, etc. others is illegal.

 

See? Not so simple. Nothing is ever black-and-white. It depends on the context.

 

If I were Adobe I'd think twice before allowing full "creative freedom" and enabling ANYONE --without any artistic technical skills-- to generate photo-realistic offensive and illegal imagery within seconds anywhere in the world.

 

It's all about context.

New Here, Feb 12, 2025

@rayek.elfin 

No, your argument is blurring the line between tools and actors, deliberately attributing a person's crime to the tool.

Unless a tool is totally, specifically designed for committing crimes, the responsibility for a crime always lies with the person who misuses it, not the tool itself.

Yes, some tools are indeed more prone to misuse. In such cases, the proper approach should be implementing more thorough user qualification reviews, rather than restricting all users to a limited version.

Regarding the claim that creating nude art is illegal in certain countries, the proper approach should be nationality verification - implementing restrictions only for users from countries where such content is illegal. The solution should not be imposing a global restriction just because it is illegal in one country. Otherwise, basic tools like Photoshop or even a painter’s brush should also be restricted, which would be an unreasonable double standard.

You don’t need to defend Adobe. Based on your argument, the best approach for Adobe should be to establish different user tiers, for example:

  • Limited - for countries that do not allow nudity or similar content.
  • General - allows most uses, except age-restricted content and obviously illegal content.
  • Advanced - requires verification of age and nationality, and allows the generation of any content except obviously illegal content, under stricter usage commitments (to legally exempt Adobe from liability).
  • Pro - for users with artistic and technical skills, under the strictest usage commitments, who can generate almost anything (see the sketch after this list).
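To make the tiers concrete, here is a rough, hypothetical JavaScript sketch of how such a tier assignment could work; the field names and checks are invented purely for illustration and are not anything Adobe actually offers.

```javascript
// Hypothetical sketch of the Limited / General / Advanced / Pro tiers proposed above.
function tierFor(user, restrictedCountries) {
  // Countries where such content is illegal only ever get the Limited tier.
  if (restrictedCountries.includes(user.country)) return "limited";
  // Pro: verified age, nationality, and professional portfolio, strictest commitments.
  if (user.ageVerified && user.nationalityVerified && user.portfolioVerified) return "pro";
  // Advanced: verified age and nationality, stricter usage commitments.
  if (user.ageVerified && user.nationalityVerified) return "advanced";
  // General: everyone else; age-restricted and obviously illegal content stays excluded.
  return "general";
}

// Example: an age- and nationality-verified user outside restricted countries lands on "advanced".
console.log(tierFor(
  { country: "DE", ageVerified: true, nationalityVerified: true, portfolioVerified: false },
  ["XX"]
));
```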

It is obvious that Adobe has taken the lazy route by making everyone use the limited version, with moderation that has a high misjudgment rate, without considering age, nationality, or artistic and technical skill. Adobe is the one not considering the context.

Of course, this isn't called a mistake - it's Adobe's business decision.

Mentor, Feb 13, 2025

@DdavidChung

 

First, I actually am in favour of full freedom of information and art. If people here want to use genAI to create nudes in their art, then it is simple enough to install open source tools that are available for free and do not censor the output. Get a good Nvidia GPU, and you have full creative control!

 

Secondly, it is impossible to separate the tools from the actors and vice versa. It is well researched and documented that a tool will influence the actor's behaviour, thoughts, intentions, and interactions in the external world. It is an extremely naive worldview to think otherwise. Context is everything. And everything is connected, nothing is truly separable.

 

What is happening in the world right now only proves that the average human being still needs supervision. Earlier in this thread someone mentioned that only children need to be watched over. Obviously many adult humans require the constant threat and supervision of the law and society's policing to behave "nicely". And unfortunately this works on both micro and macro levels: give a country nuclear weapons and its behaviour changes dramatically on the world stage.

 

If it were true that human adults require no supervision we wouldn't still have wars, strife, and untold human and non-human suffering all around our small and fragile planet. We would work together instead and nurture every single human life and non-human life. This, unfortunately, is not the case. 

 

So while I hope for a better future in which humans can be allowed full freedom, reality tells us otherwise at this point in time. Controls and supervision need to be in place. GenAI's threshold for destructive purposes is too low. The proverbial monkey could generate "art" with it.

 

Now, I do agree that it would be possible for Adobe to put firewalls and paywalls up for those who want to use its genAI tools to create nude art. It is however a cost-benefit calculation, and it seems Adobe management decided against potential litigation and a large extra financial overhead to maintain these.

 

Anyway, the tools are out there to do what you @DdavidChung want to do. Redirect your attention elsewhere instead of butting heads with a corporate wall that will not and cannot budge (for good reasons).

Community Expert, Feb 13, 2025

Very good, @rayek.elfin 🙂 I'd sign off on that.

 

I normally don't get into these discussions, but now that I'm here, I'd like to specifically point out one thing: what kids do to each other on social media. That alone should make it obvious that you can't simply let this loose.

Explorer, Feb 13, 2025

I fully support and wholeheartedly agree with this: Limited, General, Advanced, Pro.

Dario de Judicibus

Explorer, Feb 13, 2025

Well said.

 

Explorer, Feb 13, 2025

Indeed!

Explorer, Feb 15, 2025

Considering it is Adobe AI, from one of the industry's leaders, it shows how poorly they have trained the AI if it can't differentiate between red marks on a shirt and blood. Or they have the filters cranked up way beyond what they have listed in the TOS.

Community Beginner, Feb 15, 2025

"Adobe isn’t concerned about people making offensive issues they are concerned that their poorly programmed, loose-cannon software will create a pornographic image when a kid is trying to change a picture of a bug."

 

Except that if I try fantasy generations with women (for D&D character images, for example), they end up being oversexualized, in skimpy clothing with obvious surgical "enhancements." I have a hard time getting Adobe to generate family-friendly or non-sexist female illustrations, resulting in me having to specify "wearing loose pants and a long-sleeve shirt that covers the entire torso," and even then over half of the generated images don't follow the prompts and I end up with a mini skirt and a crop top. If a woman wants to wear that or generate that, then great, but if I want to make an image for a 12-year-old friend of my son and she wants an image of her Elf Ranger for D&D, it is a pain. It is literally the worst of both worlds. You can't create a "no guns" sign for a public building, but you also end up with inappropriate stuff when you don't want it.

Community Beginner, Feb 16, 2025

Hello dear Adobe,

Guys, are you planning to reconsider the overly strict censorship in Adobe Firefly? A completely decent photo of a girl in a swimsuit—are these p***graphic materials or too much nudity? Seriously? Have you been to the beach or pool recently? The censorship can even target regular summer tops where a woman's stomach is exposed...
[Attached image: Untitled-2 (19).png]


"Cartoon superhero duck wearing a dramatic cape and mask, holding a toy gun in a heroic pose"—what's wrong here? Violence? Cruelty? Guys, a duck holding a toy gun is a theme from a children's cartoon for kids aged 6 and up...
[Attached image: Untitled-1 (12).png]

Adobe, you have always made your software primarily for professionals who need to accomplish a variety of tasks. But the overly strict censorship of your AI greatly limits the ability to perform these tasks. I mostly create materials for a lingerie store (not a p*** studio), and I cannot always use Adobe AI capabilities.

Some people, believe it or not, need to use "gun" in their work not to promote violence.

I understand that without strict censorship some people will use AI for not the best purposes. Perhaps there are no other ways yet to teach a neural network what is good and what is bad. But you can maintain strict censorship in projects aimed at a wide audience (Adobe Firefly and Adobe Express web resources) while significantly simplifying it for professional software users from the Creative Cloud package.

Community Beginner, Feb 16, 2025

Please understand me correctly: I do not want to offend anyone's religious feelings with this image. But this method really works!

Is this correct? Is this how professionals should work in professional software?

[Attached image: Untitled-3 (10).png]

Community Expert, Feb 18, 2025

Please keep the discussions focused on Adobe Firefly, not general political discussions.

 

Thanks,

    droopy, Moderator

Explorer, Feb 22, 2025


It's interesting that the same image got different reactions from Photoshop. The word "kitchen" was fine. "Bedroom," not so much.

There are far bigger issues in the world than this silliness. Still, it's worth a thought or two.

[Attached image: fill.jpg]
