
P: Generated images violate user guidelines

Community Beginner, May 23, 2023

Bunny.png

image (1).png

 

So as you can see, it's a PG-13 relatively inoffensive image of a woman in a bunny outfit. The top worked fine, and I was able to complete the top ear, which is cool. When I tried to extend the bottom with generative fill, though, I got this warning. They're just a pair of legs wearing stockings, and I wanted to extend it.

It feels like a false flag - though I could be wrong? I find myself thinking it would do the same for women in swimsuits.

Figured I'd share here.

Status: Bug · Started · Locked
TOPICS: Desktop-macOS, Desktop-Windows
Views: 214.2K


1 Correct answer

Adobe Employee, Nov 10, 2023

Dear Community,

On November 7th, 2023, the Firefly for Photoshop service was updated to improve this issue. You should encounter fewer guideline errors when working on or near skin-tone areas that do not violate the community guidelines.

While this improvement is a big step in the right direction, we are continuing to explore new ways to minimize false positives. Please continue to give us feedback on this new forum thread, and also report false violation errors in the application.

Thank you.

...
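The accepted answer attributes many of the errors to edits on or near skin-tone areas. Purely as an illustration of why that produces false positives (this is NOT Adobe's filter; every function name and threshold below is invented), a coarse skin-tone heuristic can reject a region on pixel statistics alone, regardless of what the region actually depicts:

```python
# Illustrative toy heuristic only -- NOT Adobe's actual moderation filter.
# All names and thresholds here are invented for the example.

def looks_like_skin(r, g, b):
    # A classic rough RGB rule of thumb for light skin tones.
    return r > 95 and g > 40 and b > 20 and r > g and r > b and (r - min(g, b)) > 15

def skin_ratio(pixels):
    """Fraction of (r, g, b) pixels judged skin-like."""
    pixels = list(pixels)
    if not pixels:
        return 0.0
    return sum(looks_like_skin(*p) for p in pixels) / len(pixels)

def flag_region(pixels, threshold=0.4):
    # Rejects any region whose skin-like share exceeds the threshold,
    # no matter what it depicts: legs in stockings, a sandy beach,
    # or rusty metal in warm tones.
    return skin_ratio(pixels) > threshold
```

A real system would use a trained classifier rather than an RGB rule, but the failure mode is the same: the decision keys on low-level statistics of the edited region, so innocuous crops dominated by skin-like colors get flagged.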

1,382 Replies
Community Beginner, Jun 02, 2023

blue1.jpg

Explorer, Jun 02, 2023

I get this violation warning more than 50% of the time, and mostly when it involves images of people.

Sometimes I can make several generations OK; then I get the warning, and it will go no further.

One generated image it produced was a woman with three legs, which gave me an idea, so I searched for a woman with three legs... violation of user guidelines. So I tried a man with three legs; same thing.

So I searched for a dog with three legs, and it came up with an image of a dog with an erection???

What a shower of hypocrites

Community Expert, Jun 02, 2023

But the bot isn't flagging the object. It's flagging its own results.


daniellei4510 | Community Forum Volunteer
---------------------------------------------------------
I am my cat's emotional support animal.
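The point above, that the filter screens the model's own results rather than the user's request, can be sketched as a toy loop (hypothetical names throughout; a real pipeline uses learned classifiers, but the control flow is the issue):

```python
# Hypothetical sketch -- NOT Adobe's actual pipeline. The moderation check
# runs on the *generated* candidates, not on the user's prompt, so an
# innocuous request still ends in a "guidelines" error whenever every
# candidate the model happens to produce gets rejected.

def generate_with_moderation(prompt, generate, is_allowed, n_candidates=3):
    """Generate n_candidates results and keep only those the filter allows."""
    survivors = []
    for _ in range(n_candidates):
        candidate = generate(prompt)   # model output for the prompt
        if is_allowed(candidate):      # screening happens on the output
            survivors.append(candidate)
    if not survivors:
        # Surfaced to the user as if *they* had violated the guidelines.
        raise ValueError("Generated images may violate user guidelines")
    return survivors
```

Under this structure the user has no way to tell whether their prompt, their source image, or just an unlucky batch of generations tripped the filter, which matches the confusion reported throughout this thread.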

New Here, Jun 02, 2023

I got the same thing when I tried to add smoke to a picture of a waterfall. I selected the water, clicked Generative Fill, and typed "smoke". I was then met with the same error saying my image had been removed due to a violation.

I also tried to have a ninja's cowl (a cloth hood) generated over my face while I was wearing a red clown nose, and was met with the error. The same thing also happened when I tried to generate "blood"; I wanted to replace the water in a waterfall with blood.

waterfall.jpeg

Community Expert, Jun 02, 2023

Blood is most likely a banned word.

Community Beginner, Jun 02, 2023

I'm getting a "user guidelines" violation error when using Generative Fill on various things. And I'm violating absolutely NO guidelines, unless creating rusty metal and crackled paint is forbidden. LOL.

MetalForSigns.png

Unless saying "white paint" was a problem. Surely any color is acceptable, and it knows the difference between the drug crack and crackled paint? Help. Thanks, Bryan S. Welborn

Community Beginner, Jun 02, 2023

If the bot flags its own results, which is what the error message indicates, and the image source of the AI is Adobe Stock, ... um ... 😉

Community Beginner, Jun 03, 2023

I couldn't quite get the result I wanted, hence why I was trying to have it auto-fill.

Community Beginner, Jun 03, 2023

@Kevin Stohlmeyer Forgot to tag you in the above reply.

Engaged, Jun 03, 2023

I typed "smart suit" in the prompt box and it came up with a superhero costume, on a female body at that 😆😆

On Facebook I jokingly told a female friend to shut up or I'd slap her, and I got banned for 14 days. ROFL

Community Expert, Jun 03, 2023

Yep. That's the odd part. The source images are supposedly Adobe Stock and Adobe Stock only, which represents a relatively small sampling of images compared to dedicated AI text-to-image applications. Like maybe millions compared to billions? I dunno. While there are images of what I would call PG-rated nudes and those depicting relatively mild violence on Adobe Stock, it's difficult to understand why the bot is coming up with so many results deemed to be in violation of the guidelines. So the bot does seem to go rogue quite often at this point in time. Hence the reason for the beta. I think this is an issue that will be ironed out in the months ahead. This isn't going to be a short beta period. It's going to take some time.


Engaged, Jun 03, 2023

So you'll have a job doing horror images? The whole AI scheme is very early yet; even those apps that change voices are rubbish. I've noticed a lot of YouTubers are using the text-to-speech generated voices. All in all they're OK, BUT it's hilarious when the publisher doesn't check the finished result first. Anyhoo, back on topic: I was removing people from a photo in woodland I took a couple of weeks ago, and even THAT violated user guidelines. They REALLY need to up their game or it's going to chase a lot of users away.

Community Expert, Jun 03, 2023

Why would it chase users away? Adobe existed for years WITHOUT Generative Fill. If they shut the feature off tomorrow, they wouldn't lose a single customer. Users would just have to go back to working a little harder to make their edits the old-fashioned way.

Engaged, Jun 03, 2023

THAT was a silly thing I said; yes, of course regular users such as myself will always stick with Adobe. Just a pity it's become so expensive.

Community Beginner, Jun 03, 2023

The generative AI rules need to be modified to not be so restrictive when expanding upon photographs that have to do with the military. I entered a prompt while working on an old photograph of my grandfather, taken in 1940 while he was training. My prompt read "a man in WWII field training attire", and it was flagged as violating the rules.

I've seen others try to make something historic like a musket without success, yet a bow and arrow is acceptable. Each fires a projectile with the same goal. Nonetheless, the context of the prompts has to somehow be taken into the equation. Otherwise the service alienates many forms of artistic expression.

Advocate, Jun 03, 2023

Tons of people seem to have flooded in just for this feature, so yes, they could be lost again.

Explorer, Jun 03, 2023

The AI is why I joined 

Engaged, Jun 03, 2023

Here's a new one. I had myself saved as the subject and typed "colorful shirt". I was amazed to get THIS error: "Sorry, we don't recognise your language as English, please try again." I mean, I even spelt "colorful" the American way to help the bot understand. ROFLPSML

Community Beginner, Jun 03, 2023

I often get these guideline problems and I have no idea why. Can they add more details on why this was a violation? I just clicked Generate without adding any text.

Screenshot 2023-06-03 203808.jpg

New Here, Jun 04, 2023

It forbids me to change my OWN pictures in Firefly! Worthless!!!

Explorer, Jun 04, 2023

It's getting a bit mad now. Out of 15 attempts, about 12 were refused.

Engaged, Jun 04, 2023

I reckon I've probably had roughly the same number of failed attempts. MANY of the ones that DID get through are dreadful.

Explorer, Jun 04, 2023

The 'violates community guidelines' warning is problematic.

If the underlying image is perceived as 'risky', then it won't work even when you ask for normal things: flowers, furniture, shelves, a window. Those are not problematic items. Treating the whole base image as unsafe is a bit of an over-reach, in my opinion. The AI is producing flowers; where I put them is up to me.

On some images the AI will reach into unselected regions and alter things. I thought the whole point was to select out what you don't want altered and only work inside the selection.

For example, on a skimpily dressed person with the background selected, I asked for flowers in the background. It generated the flowers but ALSO put underwear on the model, even though the model was not selected and that layer was deactivated/hidden below. It reached outside the selection and changed the underlying image.

Too far.

 

Explorer, Jun 04, 2023

But it's their own images they are flagging.

New Here, Jun 04, 2023

I'm receiving the same error, but I'm not trying to "generate" anything. I'm trying to remove the letters WK from a fully clothed person's shirt. I leave the generate box blank so it will fill based on the surroundings. If I had to guess, the issue is that the subject is wearing a short-sleeved shirt, part of their arm is in the "surrounding" area, and the presence of skin is triggering something, even though it's just an arm. And what is frustrating is that I don't want it to generate any skin; I want to remove the lettering from the shirt, so I only want it to fill in with the shirt colors.
