Participant
May 23, 2023

P: Generated images violate user guidelines


So, as you can see, it's a PG-13, relatively inoffensive image of a woman in a bunny outfit. The top worked fine, and I was able to complete the top ear, which is cool. When I tried to extend the bottom with generative fill, though, I got this warning. It's just a pair of legs in stockings that I wanted to extend.

It feels like a false positive, though I could be wrong. I suspect it would do the same for women in swimsuits.

Figured I'd share here.


Kevin Stohlmeyer
Community Expert
August 23, 2023

As with any Beta, it will continue to improve with user feedback and constructive input. Thanks!

Known Participant
August 23, 2023

@Kevin Stohlmeyer Thanks for the suggestion, and sorry for the outburst. Yes, I'm struggling with "closed hand", which gives me all kinds of open hands that I then have to fix part by part to close.


"Closed fingers" also works somewhat better, but it would be much easier to just be able to write "fist".


Kevin Stohlmeyer
Community Expert
August 23, 2023

@zvi_t First, thank you for being a tester in this beta. Second, I've found it helpful to Google synonyms for violating terms. Instead of "fist", why not try "closed hand"?

Known Participant
August 23, 2023

Baby fist. Baby fist hand. Baby's hand like a fist. VIOLATION no matter how I use the word "fist."

1. I do not want a fist punching anyone!

2. I do not want a fist killing anyone!

3. I do not want a fist committing a crime!


All I want is a baby's fist!!!! How long will it be until Adobe fixes this ridiculous issue?


Graham24508943nobd
Known Participant
August 23, 2023

Do a Google image search for smoke with a transparent background, save the picture, and copy and paste it into your image.

Known Participant
August 23, 2023

I've been trying to add a fringe (as we call it in the UK; in the US it's 'bangs', I think) to a woman's hair. I tried the web version of text to image, then the web version of generative fill, and now I'm using PS Beta 25.0.

If I select the forehead and ask for 'fringe', it gives me all sorts of weird ornaments, straps, bows and indecipherable things, but if I ask for 'bangs' it gets blocked as a violation! Not my fault the US decided to call a part of a hairstyle 'bangs' (no logic to that name that I can see whatsoever). It seems that in Adobe's mind 'bangs' is either violent, as in a gun or bomb, or sexual, and couldn't possibly be used in any other innocent context.

I've tried 'hair fringe', 'hair bangs', 'straight bangs' and more - only the prompts using 'bangs' produce something resembling hair rather than objects, but the results are usually parted and don't cover the forehead.

This is going to be confusing and unintuitive as heck if words don't give an obvious result (I'd say it should be obvious if the AI is smart enough to recognise that the area I've selected is on someone's head and surrounded by hair), or if the words we need to use are blocked because they can also be used in an 'inappropriate' context.

Solution: don't block so many words. 

Better solution: don't block anything and let people create what they want; they can suffer any potential consequences later, wherever they use the image next - just as they would with painting, drawing, sculpting, photography or any other form of art that springs to mind.

Graham24508943nobd
Known Participant
August 23, 2023

Red, try using 'swimming costume' instead of 'swimwear'. Works for me.

Graham24508943nobd
Known Participant
August 23, 2023

I don't find it slow, but it's far worse at removing objects, and it adds objects that just aren't anything, if you know what I mean. I asked for a patio and it added some god-awful monstrosity that would NOT go away even after deleting the layer. I had to scrub the project I was working on.


xandrwrld
Participant
August 23, 2023

It appears that the word "symmetrical" is a prompt that violates the guidelines. I am simply trying to add a set of steps to this image, but that keyword seems to flag it every time. This has been happening a lot recently with benign, simple things that don't go against anything in the guidelines. I think it's time for a review of PSGF's censorship algorithm.

Known Participant
August 23, 2023

But can you select specific areas with a selection tool in those programs' images and make amendments? Or do you have to use a text prompt to tell them where to make the change and hope it correctly identifies that area, e.g. a left arm, right ear, or the floor?

matiasrengel
Participant
August 20, 2023

As you can see in the image, I am trying to fill in where there was a microphone, and it just doesn't want to.