Inspiring
October 24, 2023
Question

Generative Fill Guidelines and Censoring

  • 16 replies
  • 19206 views

I have some photographs where I would LIKE to change the shirt or clothing of some of my family members. These are MY photographs that I have taken over the years. A couple of the photographs show shirts that are somewhat revealing in nature, NOT nudity, just kind of low cut. If I select the shirt with a selection tool and try to change it using Generative Fill, I get the dreaded notice that my request violates the Guidelines. NOTE: I am NOT creating nudity, or removing nudity; I am simply trying to replace a woman's shirt and a man's shirt (an old tank top) with NEW shirts. So I do not understand what guideline I am violating. Again, these are OLD images of mine, where I am just trying to replace the shirts with NEW shirts.

 

Now, here is the WEIRD part. After I got the guideline notice, I tried to see if I could create an entirely NEW document with Generative Fill, and I typed in, "Girl wearing Bikini walking on the Beach with the Ocean in the Background", and it created it. So, I can create a bikini girl, but I cannot replace a semi-low-cut shirt on a woman, or a tank top on a man. Can somebody explain a way to get around this, or will this be fixed soon?

16 replies

nicmart
Inspiring
September 23, 2024

I have cancelled my subscription to the Photo plan. Adobe violates my privacy and heavily censors generative fill. It's now the worst corporate censor and I won't be a party to that.

Legend
September 23, 2024

Nobody is violating your privacy. As for censorship, there is a lot of controversy right now about abusive uses of AI. I'm guessing that Adobe is erring on the side of caution. All it would take is a few accusations that they are, say, allowing users like you to generate, oh I don't know, child abusive material and the whole thing gets shut down.

nicmart
Inspiring
September 23, 2024

I'm not going to engage in extended debate. Adobe's terms allow it to scan user materials. I have no reason to disbelieve someone like Tony Northrup, who has written multiple books about Adobe products and now decries the invasion of privacy. Then there is "scraping," which is not my central concern.

 

In the past year, after I renewed my Photo products subscription, I haven't used generative fill on a single picture with a child in it (correction: one baby), yet the censorship notification has appeared over and over, well over a hundred times. Sometimes the problem (for Adobe) is a term, and other times the photo. Often it is something like a woman in a swimsuit or a revealing top, but never a nude. Generative Fill is very prone to censoring edits of photos of buxom women. Yesterday, it refused to allow me to fill "holster," and then I checked and "gun" is banned. I've never been able to use "bathing suit," and I ran into a prohibition on "bat" (the animal). Is it your contention that these terms, and dozens of others that I've encountered, are censored to protect children, even when there is no child in the photo? Do you believe Adobe's method can't distinguish between a buxom adult woman and a child?

 

That's all I plan to say in this forum. When Adobe is employing some method of scanning my photos and censoring my editing, it is invading my privacy.

 

[abuse removed by moderator]

Participating Frequently
September 15, 2024

I think Adobe should stop being Big Brother and policing our creativity. As a professional fashion and boudoir photographer, I frequently have images that include sheer or nude models. I also, being human, sometimes fail at framing the shot to perfection and need the Generative Expand tool to recompose my image, except that if Adobe AI detects any kind of nudity, it REFUSES TO EXPAND THE BACKGROUND. The nudity is NOT part of what is being generated, so Adobe is NEEDLESSLY and effectively censoring the content we are able to create. We are adults, and do not need babysitting or "protection" from the adult content we PERSONALLY create.

Chuck Uebele
Community Expert
September 15, 2024

That's interesting that PS does that. I would agree that your situation shouldn't be affected by Adobe's rules. This isn't what you were really asking about, but a workaround might be to create a duplicate document and remove the nudes, leaving just the background; then expand the background, and finally put the nudes back in.

Participant
September 11, 2024

I have been a Creative Cloud user for many years. I appreciate the new features introduced on the platform, such as Generative Fill, and I actively use them. However, I am unpleasantly surprised that Adobe has taken it upon itself to censor my content.

I am not violating any terms of service, yet the Generative Expand feature is simply unavailable for a large part of the images I edit. If I want to adjust image proportions, I need to generate a small strip of background, and Adobe's restrictions prevent me from doing so, forcing me to cover the woman's figure with a black fill before using Generative Expand. Since when is an image of a woman in a swimsuit considered illegal by Adobe?

 

I have a question for Adobe employees: Do you want Adobe tools to be used to edit only pictures of cats and bunnies? Which clause of your terms of service is violated by the image of a woman in a swimsuit that I am attaching as an example to this post?

D Fosse
Community Expert
September 11, 2024

I am constantly amazed by how many people fail to see the obvious potential for abuse. Of course there need to be guidelines and restrictions. The algorithm has no way of knowing the user's intent, so it just has to flag certain types of content. Adobe, unlike many others, is simply being a responsible operator in this area.

 

Nothing stops you from making whatever images you want with Photoshop. You just can't use AI without some reasonable restrictions. People used Photoshop for many years without the AI capabilities, and you can still do that.

 

If it bothers you in this instance, just crop out the woman and put her back in.

Participant
September 11, 2024

As you can see, the potential for abuse doesn't diminish just because Adobe decided to make life harder for regular users. This in no way makes Adobe a responsible operator in this area.
Furthermore, what potential for abuse do you see? Even if someone is a professional creating 18+ content, is it really Adobe's business as long as their work is fully legal?

Participant
June 16, 2024

I'm a professional photographer specializing in boudoir, maternity, and glamour shoots, and I rely heavily on your software to retouch my clients' portraits. However, your content filters on the generative AI edit feature are making it impossible to do this quickly and efficiently. 90% of my generative fill requests are getting blocked, significantly prolonging my editing process. This is extremely frustrating.

I can't fix skin, hair, or anything else that touches skin. I can't do simple outfit adjustments, and even when I'm trying to fix backgrounds, if they're skin-colored, they get flagged. This is absolutely unacceptable. I'm not shooting pornography, making requests to create nudity, or doing anything inappropriate. I'm simply trying to retouch my images.

I'm growing increasingly disappointed in your software. I need quick and precise edits for my work, and these constant errors are severely impacting my productivity.

As a company that develops software specifically for photographers like myself, you should understand how critical this functionality is for us. Please address and resolve this issue as soon as possible. This can't keep happening. DO SOMETHING. I'm literally begging you.

Signed,
A photographer who pays you a ton of money every year to be able to count on your software to work for my business.

 

Known Participant
June 16, 2024

Same here. I am starting to have issues now where nothing in this feature will work, and it's simple things like removing or adding trees. I updated the app and it still didn't fix it.

 

c.pfaffenbichler
Community Expert
June 17, 2024

@Kra_Ash , please post meaningful examples/screenshots to illustrate your specific issues. 

AndroYD
Participant
June 3, 2024

Issue: Generative AI feature blocking attempts to sanitize/remove/censor explicit content from images
Photoshop version: Adobe Photoshop 25.4.0
OS: Windows 11 64-bit, Version: 10.0.22000.2538

Steps to reproduce:
1. Open Adobe Photoshop.
2. Load an image containing explicit content that needs to be sanitized or censored (i.e. covering explicit areas such as nudity with clothes/drapings/leaves).
3. Select the area to be sanitized with the lasso tool.
4. Select "Generative AI" option.
5. Attempt to use the generative AI feature to sanitize or remove explicit content from the image.

Expected result: The generative AI feature should process the image and successfully sanitize or remove explicit content, making the image suitable for appropriate viewing.

Actual result: The generative AI feature blocks the attempt to sanitize or remove explicit content from the image, indicating that such attempts are forbidden by the rules. (False: the rules state that I should not attempt to CREATE, UPLOAD, or SHARE abusive or illegal content, or content that violates the rights of others. I'm attempting to REMOVE such content, NOT create it; it's already there and I want it GONE.)

Recommendation: Review the blocking mechanism of the generative AI feature to differentiate between attempts to create explicit content and attempts to sanitize or remove it, so the feature aligns with user intentions (i.e. specific keywords such as sanitize/remove/censor). Drawing a parallel with the content-censorship approach employed by platforms like 4kids TV, consider allowing users to use the generative AI feature for sanitization purposes without being blocked.

 

[link removed by moderator]

Participant
October 24, 2023

I do reconstructions of historical figures and when I try to use the generative AI it's censoring me so constantly it's practically useless. It won't let me use prompts as innocuous as "torso" or "Roman slave", it won't let me erase heads or fill in legs over the knee.

I tried to show a bare lower thigh on an Ötzi the Iceman reconstruction, and it bawled me out over that.

Now I understand that they don't want to be involved in making p**n, but this is unsexy, fully clothed, uncontroversial, historical illustrations they're stopping.

It's so overly censorious it's useless. I won't bother subscribing for another month. My old copy of CS5 still works fine. 

 

Edit: It actually blocked me from using the word p**n legitimately in my post. I had to censor that too. Not even the Victorians were this ridiculous. Adobe hates art.

Kevin Stohlmeyer
Community Expert
October 24, 2023

Hi @Philip33122994yy1c, it's not censoring based on your image; it's giving violations for the terms you are using. "Slave" is most likely a flagged term.

Participant
February 4, 2024

Flagging terms isn't good, though. Say the phrase "hit little girl with a car" is locked out to prevent the tool from drawing a picture of something like that for someone morbid. Flagging the phrase could also prevent someone from getting help, if the filter ignores the "what to do if someone" that comes before the flagged language.
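The failure mode described above can be sketched in a few lines. This is a hypothetical illustration of a naive substring blocklist; Adobe's actual moderation pipeline is not public, and the `BLOCKED_PHRASES` set below is invented for the example. It only shows why matching raw phrases flags help-seeking prompts along with harmful ones:

```python
# Hypothetical sketch of a naive substring blocklist (NOT Adobe's real filter).
BLOCKED_PHRASES = {"hit little girl with a car"}  # invented example entry

def is_flagged(prompt: str) -> bool:
    """Flag any prompt that contains a blocked phrase as a substring."""
    text = prompt.lower()
    return any(phrase in text for phrase in BLOCKED_PHRASES)

# The morbid prompt is blocked, as intended:
print(is_flagged("hit little girl with a car"))  # True
# ...but a prompt asking for help is blocked too, because the filter
# never looks at the words that come before the flagged phrase:
print(is_flagged("What to do if someone hit little girl with a car"))  # True
```

Because the check is pure substring containment, no amount of benign surrounding context can rescue a prompt, which is exactly the over-blocking users in this thread are describing.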