
Generated images violate user guidelines

Community Beginner, May 23, 2023

Bunny.png

image (1).png

 

So as you can see, it's a PG-13 relatively inoffensive image of a woman in a bunny outfit. The top worked fine, and I was able to complete the top ear, which is cool. When I tried to extend the bottom with generative fill, though, I got this warning. They're just a pair of legs wearing stockings, and I wanted to extend it.

It feels like a false positive - though I could be wrong? I suspect it would do the same for women in swimsuits.

Figured I'd share here.

Status: Bug, Locked · Topics: Desktop-macOS, Desktop-Windows · 293.4K views

1 Correct answer

Adobe Employee, Nov 10, 2023

Dear Community,

On November 7th, 2023, the Firefly for Photoshop service was updated and improved for this issue. You should encounter fewer guideline errors when working on or near skin-tone areas that do not violate the community guidelines.

While the improvement is a big step in the right direction, we are continuing to explore new ways to minimize false positives. Please continue to give us feedback on this forum thread and also report false violation errors in the application.
Thank you.

1,389 Replies · 1,375 Comments
Community Expert, Aug 21, 2023

Same here, even using a "." as a prompt. Solution: I covered up the "offending" parts on separate layers and it worked. I'm assuming, at least, that you were attempting to remove the prisms. (Attached: coverup 1.jpg, coverup 2.jpg)

---------------------------------------------------------------------------------------------
Community Volunteer | I don't make the rules; I just try to explain them.



--------------------------------

Why did Little Miss Muffet step on the spider? Because it got in her whey.
New Here, Aug 21, 2023

When it's not covered, sometimes nothing works.

New Here, Aug 22, 2023

I do understand that no company wants illegal things done on the web. But I am not sure it is okay for Adobe to censor the things I create within Photoshop, especially when the work is done locally on my computer.

 

And to top it off I have no idea WHAT is/was wrong with this. I only wanted to expand an image as usual and got greeted by a short info popup telling me that it's against the rules. 

 

?!

Here is a short video snippet of what happened: 
https://share.cleanshot.com/jrbWjprtzDkxmrTDXNFS 

New Here, Aug 22, 2023

I work on a lot of swimwear images and have been having the same issue. I actually got around it one day by selecting the model, telling generative fill to add a coat, then doing the generative expand and removing the coat layer. Crazy.

Community Beginner, Aug 22, 2023

Generative Expand is giving this bug all the time: when I generate only a part, or expand on all four sides of an image, the tool reports the following problem: "Images were removed because they violate user guidelines," even though the images are our own work and we hold the rights, as with the attached image designed by us.

This issue started to pop up more frequently after today's update (August 22, 2023).

Engaged, Aug 22, 2023

I've found that prompting for "swimming costume" nearly always works; even bikinis pop up frequently.

Engaged, Aug 22, 2023

I got that simply trying to remove trees earlier 😢😥

New Here, Aug 22, 2023

I agree completely with the post.

New Here, Aug 22, 2023

First time I am using Generative Fill in Photoshop Beta (25.0.0). I am trying to fill a dark section of a wall, hoping it will lighten the wall. I get an error that I am violating guidelines.

Community Expert, Aug 22, 2023

@Dan31823407zt64 please read the thread - enter a prompt, even just a period, to avoid the error.

Community Beginner, Aug 22, 2023

Generally, AI in Photoshop Beta is OK, but only sometimes and only for some types of users. For fine art nude photographers it is completely useless and irritating. Edward Weston, Alfred Stieglitz, Bill Brandt and a crowd of other fine art nude masters, whose work appears on the first page of any textbook for freshmen at any art academy, would never have had a chance to use this feature - they would violate Adobe's user guidelines.

Explorer, Aug 22, 2023

Add a "." to fix it! Or better yet, use a different AI. Adobe is desperately trying to get ahead of the curve and has started too late. Even if they fail to build a decent photography AI tool, they will still charge the same amount of money. They don't listen to complaints particularly well and rely on Adobe stans to defend them in support groups. Bugs, broken tools and crashy releases for months or years have become the norm.

Contributor, Aug 22, 2023

Prompt "3+1=4 written in the sand" violated user guidelines. I'm so sorry Adobe, I didn't see anywhere in the guidelines that we can't generate letters in sand.

Community Beginner, Aug 22, 2023

I cannot remember a time in the last 5 years when I was more frustrated than I have been in the last few weeks trying to use the Generative Fill tool in Photoshop. Every second thing I try gives the "violates terms and conditions" message, when I can easily use the exact same prompts and keywords on other platforms (for example, MidJourney). Here are a couple of examples:

1. I have a young family friend who loves all things military and army related, so I was doing up a cool poster of him dressed in military uniform with an epic battle scene behind him (rubble, military tanks, explosions, etc.). In the end I gave up and just put him against a bunch of trees in the background, because I was ready to pull my hair out with (cursing removed) Photoshop!!! Every single word I tried was banned (tank, military, historical military vehicles, war scene, etc.).

2. I was making a funny photo, sort of like a meme, with two groups of teachers fighting in a classroom (like a battle scene), and every (cursing removed) prompt I tried failed, even things like destruction, hole in wall, fire, explosion, damaged building, etc.

Adobe, for the love!!! Get your act together and figure out how to moderate prompts properly!!! AARGGGHHH! When MidJourney thinks you've prompted something inappropriate, it lets you send the prompt to a second-tier moderation bot that checks it further for problems. 95% of the time the prompt then gets through with no further issues.

 

SOOOO FRUSTRATED! 😞

New Here, Aug 22, 2023

I share your frustration. I'm a Product Designer and a big part of my work is Halloween costumes. I have to modify or add details based on what the client wants. Sometimes, when I work on any areas that show skin, like a wrist, a neck, or legs, I get the “violating guidelines” message. It’s so annoying that I have to use other AI sources to achieve the design I want.


Note: I noticed that using the word “model” in the Generative Fill can help avoid those “violation” alerts. For example: “Add a leather bracelet around the model’s wrist” usually works. More precise prompt descriptions tend to make Photoshop loosen its grip on censoring a little. 

Explorer, Aug 23, 2023

I've not used any other AI art 'generators' apart from trying a few free web-based ones and Bing AI chat. I have seen results in other programs like Stable Diffusion and questioned the quality of PS results myself - but I was thinking, do any of those other programs allow users to select areas of images and generate areas based off text prompts and/or just AI's assessment if left blank?

I assumed that in the others that if the user wants to change things they have to do it all by text prompts without selecting certain areas that they want modifying first e.g. 'move right arm above head, change dress to red, replace shoes with sneakers' etc.

Also, do any of them allow to work in layers where you can mix things up and then chop and change easily during the process? And/or incorporate your own images, real or AI?

So, I'm not stanning for Adobe here, but coming from a place myself where I've realised that Adobe and PS may have a unique tool here - but I could be wrong, which is why I'm asking.

Engaged, Aug 23, 2023

I've tried a few, including MidJourney and Stable Diffusion. MidJourney's humans are far better, but IMO Adobe's AI-generated ones have a far more natural look. Even Stable Diffusion produces some terribly distorted limbs and faces.

Explorer, Aug 23, 2023

But can you select specific areas of the image with a selection tool in those programs and make amendments? Or do you have to use a text prompt to tell them where to make the change, and hope it correctly identifies that area, e.g. a left arm, a right ear, the floor?

Community Beginner, Aug 23, 2023

I am under the impression that the Generative Fill feature is only for children between the ages of 4 and 12. I really don't understand this completely nonsensical restriction by Adobe. I have found that sometimes just typing a dot instead of a prompt will generate the fill.

New Here, Aug 23, 2023

It appears that the word "symmetrical" is a prompt that violates the guidelines. I am simply trying to put in a set of steps into this image, but that keyword seems to flag it every time. This has been happening a lot recently with benign, simple things that do not go against anything in the guidelines. I think it's time for a review of the PSGF's censorship algorithm.   

Engaged, Aug 23, 2023

I don't find it slow, but it's far worse at removing objects and at adding objects that just aren't anything, if you know what I mean? I asked for a patio and it added some gawdawful monstrosity that would NOT go away even after deleting the layer. I had to scrub the project I was working on.

 

Engaged, Aug 23, 2023

Red, try using swimming costume instead of swimwear. Works for me 

Explorer, Aug 23, 2023

I've been trying to add a fringe (what we in the UK call them - in the US it's 'bangs' I think) to a woman's hair. I've tried in the web version of text to image, then tried in the web version of generative fill, and now I'm using PS Beta 25.0.

If I select the forehead and ask for 'fringe' it gives me all sorts of weird ornaments, straps, bows and indecipherable things, but if I ask for 'bangs' it gets blocked as a violation! Not my fault the US decided to call a part of a hairstyle 'bangs' (no logic to that name that I can see whatsoever). Bangs is either violent as a gun or bomb 'bangs' or is sexual in Adobe's mind it would seem, and not possibly used in any other innocent context.

I've tried 'hair fringe', 'hair bangs', 'straight bangs' and more - only the prompts using 'bangs' produce something resembling hair rather than objects, but the results are usually parted and don't cover the forehead.

This is going to be confusing and unintuitive as heck if words don't give an obvious result (I'd say it should be obvious if the AI is smart enough to recognise that the area I've selected is on someone's head at least, and surrounded by hair) or the words we need to use are getting blocked because they can also be used in an 'inappropriate' context. 

Solution: don't block so many words. 

Better solution: don't block anything and let people create what they want; they can later suffer the potential consequences wherever they use the image next - just as they would with painting, drawing, sculpting, photography or any other form of art that springs to mind.

New Here, Aug 23, 2023

Attempted to use the generative expand function to increase the size of an image of billowing smoke so I could have it match the size of another file I wish to use it in. Using the crop tool to expand the background size and hitting "generate" does not give satisfactory results. Then I used the generative prompt field in the properties tab that pops up, and the word "smoke" triggers a TOS violation message. I suppose this is due to an association with the violence and/or illegal activities or goods portions of the policies, yet there's plenty of reasonable cases where an artist would want to use smoke in a project that would not violate TOS. Examples that come to mind off the cuff would be anything relating to cars or auto sports, use for a dramatic movie-style poster, or artwork involving fire fighters and other first responders. Could also be used to provide a base texture to be significantly modified through filters and other artistic expressions.

  1. Version of the app: Photoshop Beta 25.0.0
  2. OS: Windows 11 Home, Version: 22H2, OS build: 22621.2134, Experience: Windows Feature Experience Pack 1000.22659.1000.0
  3. Basic steps to reproduce the problem: use generative expand prompt "smoke" as shown in the attached image
  4. Expected result and actual result: allow the feature to function with the prompt provided
Engaged, Aug 23, 2023

Do a Google image search for smoke with a transparent background, save the pic, and copy & paste it into your image.
