Participant
May 23, 2023

P: Generated images violate user guidelines


So as you can see, it's a PG-13 relatively inoffensive image of a woman in a bunny outfit. The top worked fine, and I was able to complete the top ear, which is cool. When I tried to extend the bottom with generative fill, though, I got this warning. They're just a pair of legs wearing stockings, and I wanted to extend it.

It feels like a false flag, though I could be wrong. I suspect it would do the same for women in swimsuits.

Figured I'd share here.

This topic has been closed for replies.

1087 replies

JR Boulay
Community Expert
September 1, 2023

Without prompt:


Acrobate du PDF, InDesigner et Photoshopographe
daniellei4510
Community Expert
August 31, 2023

I entered a period (.) for the prompt, and this resulted in removing the hair AND your marching ants. 🙂 Entering a period has been a known workaround for some time in instances like these. But yes, I would have used more traditional tools.

Adobe Community Expert | If you aren't submitting your assets in sRGB, you probably didn't read the rules.
Known Participant
August 31, 2023

It seems the latest version of the Photoshop beta has excessive restrictions on the AI tool. I came across this error many times when trying to remove unwanted objects from a photo. In this case, I am trying to remove stray hair from the subject. I know there are other ways to remove hair in this particular case, but I don't see how this violates any community guidelines, and I believe it's a bug. Let me know what your experience is and how to get around it.

daniellei4510
Community Expert
August 30, 2023

As an Adobe Stock Contributor, I don't use Generative Fill when working on images I intend to submit. But as a former figure photographer, I do enjoy experimenting with it from time to time on some of my old photographs. Here I added a swimsuit top to a nude model. I got back three examples and no guideline warnings. What you are experiencing is a bug and has nothing to do with Adobe being the "moral gestapo."

Adobe Community Expert | If you aren't submitting your assets in sRGB, you probably didn't read the rules.
Kevin Stohlmeyer
Community Expert
August 30, 2023

@AH-2 you do realize this is a voluntary beta test program and not a final public release? You are participating in testing and giving constructive, detailed feedback to help improve the program and its AI. As many others have posted here, how you use Gen Fill and other AI-driven components, and how effectively you write prompts, can make all the difference between a false-flag failure and a positive outcome.

Participant
August 30, 2023

@Kevin Stohlmeyer 

I already described the problem, and I'm not just complaining. What I can do, and probably will, is switch to different software. There are choices out there now. Adobe will lose a customer, and somebody else will get my money.

Kevin Stohlmeyer
Community Expert
August 30, 2023

@AH-2 so you can give examples then? Help us help you. Or just complain, your choice.

Participant
August 30, 2023

@Kevin Stohlmeyer 

What makes it even more pathetic is that I am not even trying to modify the image itself. All I am trying to do right now is extend the background outward. Sometimes I do have to overlap the selection with the image to get Generative Expand to work. I tried different lengths of overlap; none of them worked. As for the prompts, where should I start? We could open the dictionary and it's all there, by the numbers.

Kevin Stohlmeyer
Community Expert
August 30, 2023

Hi @AH-2, thank you for your feedback. Can you supply screenshots of the selection and the specific prompts you are using when you get a failure that may be an error?

Participant
August 30, 2023

The app is totally useless. I do edit photos of women, but they are all DRESSED, for God's sake. There's no nudity whatsoever in the pics. Ten times out of ten I get a guideline violation message. Prompt or no prompt, same result. It looks like when the cloud detects ANY type of female in the picture, it rejects it by default. This is completely ridiculous. Well, I'm gonna have to wait for some other AI plugin to come along and do the job.