
P: Generated images violate user guidelines

Community Beginner, May 23, 2023


Bunny.png

image (1).png

 

So as you can see, it's a PG-13, relatively inoffensive image of a woman in a bunny outfit. The top worked fine, and I was able to complete the top ear, which is cool. When I tried to extend the bottom with generative fill, though, I got this warning. They're just a pair of legs wearing stockings, and I wanted to extend them.

It feels like a false flag, though I could be wrong. I suspect it would do the same for women in swimsuits.

Figured I'd share here.

Bug Started Locked
TOPICS: Desktop-macOS, Desktop-Windows
153.7K Views


1 Correct answer

Adobe Employee, Nov 10, 2023

Dear Community,

On November 7th, 2023, the Firefly for Photoshop service was updated and improved for this issue. You should encounter fewer guideline errors when working on or near skin-tone areas that do not violate the community guidelines.

While the improvement is a big step in the right direction, we are continuing to explore new ways to minimize false positives. Please continue to give us feedback on this new forum thread, and also report false violation errors in the application.
Thank you

...

1,379 Replies
Community Beginner, Jun 01, 2023

Well, the guidelines say "No NSFW"; this has nothing to do with the beta.

Community Beginner, Jun 01, 2023

I'm curious what happens when the bug is removed.
In many parts of the world, NSFW regulations are nowhere near as restrictive as in the US. Topless women are even shown on public TV (oh, naked men too ;-)). (In my case Europe/Germany, and don't ask what's going on in France or Italy.) These are certainly countries with large numbers of Adobe customers. And then there is artistic freedom: not all artists create images for kindergartens. It is probably the artist's own responsibility to decide what he can/may publish where.
It's really very annoying that US companies tell artists all over the world what they can and can't do.
This isn't limited to Adobe.

 

PS: Btw, the word t-o-p-less is not allowed in this forum! ... Really? C'mon.

Explorer, Jun 02, 2023

I feel extremely uncomfortable when my tools tell me what I'm allowed to do with them!

What would our culture be like if...

Michelangelo's chisels hadn't worked because their manufacturers didn't like depictions of naked men?

Sandro Botticelli's brushes had failed because they were not allowed to paint a Venus?

The tool I use judges what I am allowed to do? Really?

Is that censorship?
Any response from Adobe?

Community Expert, Jun 02, 2023

Let's not blame Adobe just yet. Remember...when technology like this comes around, pornographers are the first to jump in and use it to their advantage. That said, it has nothing to do with censorship. It's a freaking bot, not a human, making these judgement calls based on what the bot creates, not what we specifically ask for. And if you think this is bad, do a Google search for words banned on Midjourney. Like pixie, for example. Or transparent. Everybody calm down. 🙂

Community Beginner, Jun 02, 2023

About violations not being tallied by Adobe: I would like to hear that from an Adobe rep or Adobe's legal department. Too bad that a company that makes billions a year can't staff proper support and relies only on a community to answer questions. I'm not saying you are wrong, and thanks for responding at least, but Adobe has to have personnel who can answer these questions and be liable for the answer. If you are mistaken, Adobe will claim that the answer came from a community member and not an Adobe associate. Just a simple way to avoid legal liability.

Community Beginner, Jun 02, 2023

We can blame Adobe for the improper wording. The message should instead say that the image the Adobe AI created does not meet Adobe's guidelines; as currently worded, it implies that your image does not meet Adobe's guidelines. They have to fix this to avoid consumer rejection of the program. Right now, only an Adobe rep can assure me that they are not tallying the people who get guideline messages. Most likely not, because it's a beta program, but proper assurance from Adobe staff would make me feel more comfortable using it.

Explorer, Jun 02, 2023

The guidelines don't actually say "No NSFW"; they say not to use it to create pornographic or harmful content.

Explorer, Jun 02, 2023

Please don't get me wrong. I do not blame anyone. I would like to get some answers.
Maybe it's a freaking bot, but Adobe built it and set the rules. (As Midjourney does.)
I think there is a new universe of possibilities, and we the people (and the developers?) haven't thought about the implications yet.
So it's safer for AI companies to block certain things at the moment.

New Here, Jun 02, 2023

When I ask for generative fill, I'm told it deleted the suggestions because I violated the usage policy. Yet I did nothing but click Generate.

Explorer, Jun 02, 2023

I suspect they do track every violation, but don't tally them to say "20 strikes and you're out." There is a difference between the AI getting it wrong and someone trying to abuse it.

I'd like to think I can have 1,000 flags on innocent images, or even non-pornographic NSFW images, and they will review them and go "the AI failed here." But I'd also like to think that if it was clear misuse, especially where a minor was involved and someone was trying to turn that image NSFW, then Adobe would and should report it if the prompt was intentionally trying to create harmful content. If an innocent prompt led to a NSFW image, then Adobe would be at fault, and that is what they want to avoid.

Photoshop as it stands now can be used maliciously; it is not hard to take a NSFW image and put it on a celebrity's head with simple blending skills and call it a deepfake, but AI can make this task a lot easier for the layman without retouching skills. That is what Adobe is trying to protect against.

New Here, Jun 02, 2023

MP323023587tj8m_0-1685715358650.png

Don't think this is an inappropriate object.

New Here, Jun 02, 2023

Repeatedly, I have encountered situations where I am unable to utilize this technology due to the judgment of an algorithm that deems it unsuitable based on corporate guidelines. Explain to me how this is considered an improvement when we are given a tool that refuses to function. I would assume that your technology possesses enough intelligence to distinguish between what is doubtful and what clearly goes against corporate guidelines. This unfortunate outcome arises when legal departments become involved in the development of software. The face is blurred to protect the subject.

arm.jpg

 

Community Expert, Jun 02, 2023

@tomw20908746, ha, so glad someone got my point! I wondered if the reason “cattle stampede” threw off a warning (and why so many other seemingly innocuous prompts aren't working in GF) is because the AI is pulling mostly from Adobe Stock, and the requested imagery is either banned there or not found in the library. So, I tested that theory out this morning, but that's not the answer. To be continued…

 

adsldj70231.jpg

New Here, Jun 02, 2023

I want to add legs to a photo of a caterpillar, yet when I write just the prompt 'legs' it thinks I'm being inappropriate?

New Here, Jun 02, 2023

It blows my mind that we can use AI to generate anything to do with alcohol, but marijuana is against the guidelines. I run a weed business in a legal state (which almost half the US is now) and wanted to add a weed bud to a photo, but nope, can't do that! IT'S AGAINST THE RULES! Because why? Weed is somehow worse than alcohol?...

 

Engaged, Jun 02, 2023

I too have tried numerous empty prompts, with the same results you're experiencing. It seems to me it's getting worse. The hands are just appalling most of the time as well.

Explorer, Jun 02, 2023

Yeah. This new tool feels amazing, but as highly restricted as it is now, I don't see myself getting serious use out of it beyond the "wow" factor when showing my friends and colleagues.

Right now I'm trying to make "sci-fi soldier turned back" or "back of a sci-fi soldier" and it doesn't allow me, but it has no problem with "sci-fi soldier". So the problem is with "back" and most parts of the body. They are trying to stop people using it to make fake E R 0t I C images or whatever, but blocking parts of the body is an overreach and removes the usefulness of the feature.
Edit: what? Why is ero itc blocked in the forums? Who runs this place?

Explorer, Jun 02, 2023

Wow, more ridiculousness!

Explorer, Jun 02, 2023

P*rn might be forbidden, and community standards will prevent you from, say, swapping someone into lingerie or a bikini... but that won't stop the AI from getting creative on its own.

Original and AI'd version.

A.jpg

Cyberpunk clothes 1.jpg

  

Explorer, Jun 02, 2023

1st prompt was "Cyberpunk Nightclub" with the background selected.

2nd prompt was "Cyberpunk clothing" with a box around her torso selected.

Engaged, Jun 02, 2023

I'd like to know why, when asking for a wrestler in a prompt (with my own head left in), I was given the body of a scantily clad female with ginormous boobies. Same result when I prompted a Viking. Scantily clad female!! Double-standard bots, if you ask me.

Community Expert, Jun 02, 2023

OK. I've only tried this twice, but it's been successful twice, so maybe I'm on to something (don't hold me to it). I was attempting to use generative fill to remove tape and a feeding tube from the cheek of a newborn baby. This was at the request of the mom, who wanted some decent baby pictures to look back on some day without the feeding tube showing up in every photo.

I made a loose selection of the tube and repeatedly received guideline violation warnings. On a whim, I made an additional but very small selection on another area of the photo (something I could easily fix with content aware or the spot healing brush).

On my next attempt, generative fill worked flawlessly, with three examples to choose from. I tried it a second time and had the same positive results. Then I tried it again with a different image of the same baby. If I only selected the tube, I'd get warnings. When I selected another small area of the photo in addition to the main selection, it worked perfectly once again.

Engaged, Jun 02, 2023

Like I said in a previous reply, the hands are HIDEOUS. VERY frequently the generation DOESN'T keep the shape of the selected subject.

Community Expert, Jun 02, 2023

The only AI application to date capable of creating usable hands at least SOME of the time is Midjourney.

New Here, Jun 02, 2023

The important thing to note is that it's the results that the guidelines say can't be displayed. It has nothing to do with the source image, but with what imagery the AI finds to fit the requirements. For example, you have an image with a cropped elbow, but the best-fit image the AI comes up with to replace that elbow includes nudity, and bam, you get "guideline blocked". This all comes down to the AI needing more training, i.e. more source material to work from.

The only solution for this is for the generative layer's mask to be taken into consideration by the currently overzealous censor, so that only what will be visible is checked. I realise that people could then simply edit the mask to reveal the censored content, which is why I suggest locking the mask for any results that go against the guidelines.
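The mask-aware idea above can be sketched in a few lines. This is a purely illustrative sketch of the commenter's proposal, not Adobe's actual implementation: `is_flagged` is a hypothetical stand-in for whatever moderation model the service runs, and the toy brightness check below is obviously not a real content filter.

```python
import numpy as np

def moderate_visible_region(generated, mask, is_flagged):
    """Sketch of the proposal: run the content check only on pixels the
    layer mask will actually reveal, and lock the mask whenever the full
    (partly hidden) result would violate the guidelines.

    generated  : HxWx3 uint8 array, the raw generative-fill output
    mask       : HxW float array in [0, 1]; 1 = visible
    is_flagged : stand-in for a moderation model (hypothetical)
    """
    # Composite: only what survives the mask is what the user can see.
    visible = (generated * mask[..., None]).astype(np.uint8)
    allowed = not bool(is_flagged(visible))
    # If the unmasked result is objectionable, lock the mask so it
    # cannot be edited later to reveal the censored pixels.
    lock_mask = bool(is_flagged(generated))
    return allowed, lock_mask

# Toy stand-in moderation model: flags any image whose mean brightness
# exceeds a threshold (purely illustrative).
flag = lambda img: img.mean() > 128

gen = np.full((4, 4, 3), 255, dtype=np.uint8)  # "objectionable" output
hidden = np.zeros((4, 4))                      # mask hides everything
print(moderate_visible_region(gen, hidden, flag))  # → (True, True)
```

Under this policy the fill in the example is allowed, because nothing objectionable is actually visible, but the mask gets locked, which is exactly the trade-off the post describes.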
