Generative Fill Guidelines and Censoring

Question

Inspiring
October 24, 2023

  • 16 replies
  • 19167 views

I have some photographs where I would LIKE to change the shirt or clothing of some of my family members. These are MY photographs that I have taken over the years. A couple of the photographs show shirts that are somewhat revealing in nature, NOT nudity, just kind of low cut. If I select the shirt with a selection tool and try to change the shirt or top using Generative Fill, I get the dreaded notice that my request violates the Guidelines. NOTE: I am NOT creating nudity or removing nudity; I am simply trying to replace a woman's shirt and a man's shirt (an old tank top) with NEW shirts. So, I do not understand what guideline I am violating. Again, these are OLD images of mine, where I am just trying to replace the existing shirts with NEW shirts.

 

Now, here is the WEIRD part. After I got the guideline notice, I tried to see if I could create an entirely NEW document with Generative Fill, and I typed in "Girl wearing a Bikini walking on the Beach with the Ocean in the Background", and it created it. So, I can create a bikini girl, but I cannot replace a semi-low-cut shirt on a woman or a tank top on a man. Can somebody explain a way to get around this, or will this be fixed soon?

16 replies

KissMyKite
Participating Frequently
August 8, 2025

I've been a photographer for over 30 years and I'm upset, as are the designers, magazines, ad agencies, and my fellow photographers and editors, that Adobe is censoring all our work.

 

I shoot for Vogue, ELLE, Harper's, NYFW, and the Couture Weeks in NY, London and Paris, and Adobe is censoring everyone's work. I take one easy step to expand a frame on a 4x3, and I'm told my work does not meet community standards.

 

I think the entire industry should be in an uproar over this. These are artists' tools. If you buy a tube of paint or a canvas, does it dictate what you can paint? If you buy a notebook, does it tell you what you can write? Adobe is censoring artists and I think it's awful.

KissMyKite
Participating Frequently
April 14, 2025

I shoot for Vogue (Condé Nast), Harper's (Hearst), Sports Illustrated Swimsuit, WPP (ad agency) and the TV show Dancing with the Stars (BBC Studios). Everyone from the editors to the creative directors is very frustrated with Adobe. For the first time that I can remember, one firm is changing its entire WAN/SAN edit suites for NY, LA, Paris and Spain because they feel the fashion industry has had it with Adobe. I get it, and if Adobe wants to censor what you can edit, they will just continue to watch people walk away. A simple pair of shorts, a skirt, a bathing suit, a backless dress, an Olympic athlete: all examples of Photoshop destroying workflow. This could be Adobe's BUD LIGHT moment.

Inspiring
May 28, 2025

My goodness. Please do something about this issue.

I get the dreaded brown box way too often when trying to edit photos showing too much skin even though the "model" is wearing clothing that's appropriate for being in public.

I recently tried to change a shirt on a model and, after waiting, I was rejected.

MatrDatr
Participant
April 23, 2025

Still not fixed. I have the same type of issue, not just once but constantly. All I want to do is fix skin blemishes, fix skin tones, and smooth skin. The word "skin" almost always triggers a violation. Other words that trigger violations: lips, hair, blonde hair, bikini, mini skirt, woman, face, nose. It's not like I am creating masterpieces; much of what I do is for Amazon review photos that I want to look better. All photos are fully clothed, though they might show underwear or bras - you know, the stuff they sell on Amazon. Try taking a photo of yourself in underwear, then edit it in PS and use gen-fill to smooth leg skin - a big ugly blinking orange message pops up saying I violated community guidelines and need to read them. Well, I've read them at least 5 times and don't see what I am violating. There is no advice on how to correct the issue, and none of the suggestions I've seen so far are of value (i.e., "use a different program" - are you kidding? PS is the world's best photo editor; it needs to be able to do these simple tasks without upsetting users with censorship).
Sure doesn't seem like they are listening. I'm going to install Stable Diffusion this weekend and try it; I read there are plug-in versions of it for Photoshop with far fewer limits on content generation. Instead of a big ugly violation error, I wish the response would say specifically what is unacceptable and then help me correct it and get PS to do what I need. After all, this is what was advertised and why I pay monthly for it.
Oh my: "The message body contains [removed], which is not permitted in this community."

 

[abuse removed by moderator]

Marc
Participant
March 7, 2025

I am an adult customer of Adobe's. I am also a professional artist. I have been using Photoshop regularly for nearly 30 years now.
I feel Adobe should trust me to use this technology in a safe and tasteful way. The current restrictions are rather off-putting, as Photoshop is an essential tool in my process for creating my art. I would like to continue to use it and grow with it in an unrestricted way.

 

I have read the generative AI terms, and I find it concerning that nudity would be among the list of prohibited types of content, alongside such disgusting categories as terrorism and child exploitation. Nudity should not be condemned, especially not for adult artists and working professionals who pay for access to tools like Photoshop.

The human body should be appreciated, not categorized or condemned alongside things like terrorism. I thought Adobe would have understood that restrictions on artists are restrictions on our creativity. It changes how I feel about using the software. In my art, if I paint the likeness of human (adult/tasteful) nudity, I am not doing anything wrong, nor anything that is of concern to anyone outside of myself and my client. I would like to feel that Adobe respects me as a competent adult and artist. To limit my creative expression in any way simply because of what some other criminal delinquent might do with it is insulting and inappropriate. And it goes against the freedom of expression that makes art so personal and compelling.

AxelMatt
Community Expert
March 7, 2025

This is a long-term topic here and there are countless entries in the appropriate forums.

For example, see here: P: Generated images violate user guidelines – Page 55 - Adobe Community - 13811808

 

My System: Intel i7-8700K - 64GB RAM - NVidia Geforce RTX 3060 - Windows 11 Pro 25H2 -- LR-Classic 15 - Photoshop 27 - Nik Collection 8 - PureRAW 5 - Topaz Photo
davescm
Community Expert
January 25, 2025

@davescm This is extremely unhelpful. Why post this reminder without pointing out what the issue is?

 

Passive aggressive bot-posted nonsense like this should be against community standards.


@byronw41971227 @robertw44482386 The post I was responding to, with the link to the guidelines, has since been removed.

Dave

robertw44482386
Inspiring
January 25, 2025

@SheldonvL Please read the community guidelines before posting.
https://community.adobe.com/t5/using-the-community-discussions/adobe-community-guidelines/td-p/4788157


Dave


If something was said or posted against the Community rules, please just say so. When you post a link to the rules, it appears as though you are saying something was violated, and we have to GUESS what was violated.

 

So, please, just advise us of what the violation was. I am quite sure NOBODY meant to violate any community guidelines; we just asked a simple question.

robertw44482386
Inspiring
October 25, 2024

Having exactly this issue. Wake up, Adobe. People have breasts, and we have to retouch the material that clothes them, just the same as bellies, ankles and elbows.

Inspiring
November 1, 2024

I am sure they will reconsider their Guidelines at some point, but it does seem a little heavy-handed to not allow Subscribers to use the tool when NOTHING is wrong with the photos we are trying to fix. It is not like we are trying to create NUDITY at all. And regardless, nudity is part of photography, as long as it is not pornography and does not involve children; that is STRICTLY against the law, and Adobe could simply report it. So, I expect Adobe to fix this issue at some point.

Participant
March 26, 2025

I'm having the same issue! I've been trying to simply remove a bit of unwanted chest hair from a MAN (you can't even see his n*pples! -- yes, I'm censoring myself because who knows), but it keeps telling me it's against guidelines. I love the fact that they're kind of acting like children about the idea of nudity in photography, not even allowing us to use creative nudity. Eye roll. I hope they do something to fix this. Just wanted you to know you're not alone in your frustration!

Participating Frequently
October 19, 2024

I love the tool that lets me select an area of my picture and use generative AI to fix or enhance it, but I hate the inaccurate content restrictions. I am working on illustrating the Bible, which is a huge task even with AI to help. Often I find there are problems with hands, feet, and faces, and it is great to be able to fix them with the lasso tool and generative AI. The sad truth is that I am constantly hit with guideline violations that seem to have no basis that I can see or understand. A generic guideline-violation message is not enough.

 

For example, here is a picture I am working on. It is a scene from the Bible.

There are a number of problems with the people in this scene that I'm trying to fix. For some reason, these two heads will not work with any prompt I can think of. I've tried long prompts like "a woman in a biblical scene with a serious expression on her face". I've tried short prompts like "Woman face". No matter what I try, I get the violation pop-up.

No matter what I try to do, I get a guideline violation. There is no explanation of what guideline I am violating. I can't figure it out. I am constantly sending feedback on the images I work on, but nothing seems to improve. In fact, it seems to be getting worse. At times I feel so frustrated that I am actively looking for an alternative to Photoshop. Does anyone have a clue about how to deal with this roadblock?

steveisa054
Known Participant
May 31, 2025

This has happened to me many times already. At times the subjects in my photos are women wearing bikinis, or a bikini top with a full covering over the lower body. Sometimes my subject is wearing lingerie which is not see-through. If I try to use Generative Fill to remove an object in the background, or to remove flyaway hair from my subject, it fails with a warning that I am violating the community guidelines.

Why I think this behavior is wrong: In all cases, I am not using Generative Fill to modify the subject itself, neither clothing nor body, but only what is in the background, or stray flyaway hairs from the head that extend into the background. In all cases, there is no nudity at all. Bikinis and lingerie are not nudity, at least not in Europe where I live. And the portions of the photos that Generative Fill is being asked to modify do not involve the parts of the body the community guidelines might be concerned about.

Why this is an irritation: I can spend a lot of time on the selection of the background objects or the area of flyaway hairs that I would like removed, and all of this is wasted when I am not allowed to use Generative Fill due to the community guidelines. It also means that for a great deal of my photos, I cannot use Generative Fill at all, unless I perform a more complex procedure where I cut out the area that needs Generative Fill, apply it there, and then add the result back to the original photo.

How about basing the community guidelines on the areas being affected by Generative Fill rather than the entire photo? How about beefing up the algorithms so that bikinis and non-see-through lingerie are not considered nudity?

EDIT: I am not sure whether it was me posting incorrectly or the forum software, but this was meant to be a completely new post, not a reply to an existing post. It could very well be that placing it as a reply here is appropriate, but it was not what I intended. I hope people who read my post keep this in mind.

Known Participant
June 3, 2025

Thank you for posting this. All well said; I agree with and understand your frustration. This is beyond ridiculous for a platform we are paying for, one that also scrubs and sells our info while additionally throwing ads at us. I think the problem runs deeper than just a few overzealous filters. Nudity is not illegal. And if we're not deepfaking, not violating consent, and not doing anything unlawful, why are we being blocked from something as mundane as removing a hair from a bikini in a fashion shoot? (Although any blocking that goes beyond what the law requires means Adobe is trying to act as the world's gatekeeper, and that is a serious concern that will harm the art industry as we move forward.)

Let’s look at Adobe’s own rules:

  • No explicit nudity (why? Artists do art, at infinite levels. I guarantee 100% that Adobe does NOT block Hollywood from using its apps for explicit nudity. Which, technically, may be legal grounds for artists to unite and file a lawsuit.)

  • No hateful, gory, or misleading content (subjective and selectively applied. Again, they wouldn't limit Hollywood. Why does Adobe get to be the gatekeeper on this? "Hateful" is so subjective, and so easily manipulated by woke and post-modernist political agendas, that these restrictions should not be used against artists, who are usually the last stand against such things.)

  • No violating regulated activity (beyond vague, although we can assume it means "not breaking the law". But clearly every other rule is so restrictive that we couldn't anyway. So, what's the point?)

  • No privacy violations (ironic, given Adobe's model is trained on our work and other artists' work).

  • No self-harm, violations of children, terrorism (understood and agreed; THIS should be the ONLY regulation).

  • No disseminating fraudulent or misleading material (again, who's the gatekeeper? Adobe? Why is the world governed by their corporate AI?)

Here's the additional problem: None of this applies to what most of us are trying to do. We're not asking for illegal, illicit, or even "risky" adult content. We’re asking for basic image correction, religious illustration, and photo enhancement—all tasks Adobe built its reputation on.

Instead, we get AI tools that block us arbitrarily and then fail to perform at even a basic level. Worse, Adobe trains its AI on our work, reuses it, and offers us no protections in return.

It’s not about making content safer. It’s about minimizing Adobe’s liability—at the cost of usability, trust, and artistic freedom. And if that's the new direction, Adobe should stop pretending this is about community ethics. It's corporate risk management dressed up as virtue.

It's time Adobe applied its rules to itself, or let its users create without walking on eggshells.
Oh, and make the stupid program work. I mean, the Generative Expand tool alone only works when the Crop tool is active and the canvas is expanded. Unless you have those features up and ready and use them regularly, getting to use Generative Expand becomes a horrible user experience. Adobe is definitely beyond NOT listening to users and has gone well beyond not being a tool for artists, imho.