I have some photographs where I would LIKE to change the shirt or clothing of some of my family members. These are MY photographs that I have taken over the years. A couple of the photographs are of shirts that are somewhat revealing in nature, NOT nudity, kind of low cut. If I select the shirt with a selection tool and try to change the shirt or top using Generative Fill, I get the dreaded notice that my request violates the Guidelines. NOTE: I am NOT creating nudity, or removing nudity, simply trying to replace a woman's shirt and a man's shirt (an old tank top). I am simply trying to create NEW shirts. So, I do not understand what guideline I am violating. Again, these are OLD images of mine, where I am just trying to replace the shirts with NEW shirts.
Now, here is the WEIRD part. After I got the guideline notice, I tried to see if I could create an entirely NEW document with Generative Fill, and I typed in, "Girl wearing Bikini walking on the Beach with the Ocean in the Background", and it created it. So, I can create a bikini girl, but I cannot replace a semi-low-cut shirt on a woman, or a tank top on a man. Can somebody explain a way to get around this, or will this be fixed soon?
I've been a photographer for over 30 years and I'm upset, as are the designers, magazines, ad agencies, and my fellow photographers and editors, that Adobe is censoring all our work.
I shoot for Vogue, ELLE, Harper's, NYFW, Couture Weeks in NY, London and Paris, and Adobe is censoring everyone's works. I take one easy step to expand a frame on a 4x3, and I'm told my work does not meet community standards.
I think the entire industry should be in an uproar over this. These are artists' tools. If you buy a tube of paint or a canvas, does it not let you paint whatever you want? If you buy a notebook, does it tell you what you can write? Adobe is censoring artists and I think it's awful.
Adobe doesn't seem to like nakedness combined with AI; I guess it's to stop abuse. There seem to be a number of apps with a similar approach.
Nobody is censoring anything. Adobe is choosing what they allow on THEIR servers and being conservative about it. If you watch the tech press, you'll see what happens without the guardrails, like Grok outputting fake nudes of Taylor Swift (just this week...)
I've been a photographer for over 30 years and I'm upset, as are the designers, magazines, ad agencies, and my fellow photographers and editors, that Adobe is censoring all our work.
That claim appears to be untrue. (edit: Not the claim that you are upset, which is incontestably yours to say, but the claim that Adobe censors your work.)
One can use Photoshop's »offline« features on whatever images one wants, same as before.
That certain AI features which transfer image content to Adobe servers employ overly sensitive »filters« to determine »offensive« content is not »censorship«.
'If you buy a tube of paint or a canvas, does it not let you paint whatever you want?'
Of course it does, as does using Photoshop on your PC. But with generative AI, you are uploading content and asking Adobe to process it and generate content for you. It can be argued that the safeguards put in place to protect their own reputation are overly sensitive but that is the company's prerogative and is not censorship of your work.
Dave
The meaning of words aside, @KissMyKite, is Generative Expand the only case in which you usually run up against the restrictions?
Are you familiar with the usual work-around (temporarily covering the person with a [Solid Color] Layer, which can be automated in cases like this image)?
And the fact that Generative Expand/Fill has pixel limitations (notice the difference in the grain noise in the generated part on the left of the screenshot)?
I agree. I shoot for top fashion magazines like ELLE, Harper's, Vogue, NYFW, Sports Illustrated Swimsuit, and Adobe is censoring how we can use basic tools. Just to expand the frame on a 4x3 I'm told my picture does not meet community standards. Censoring how an artist can use their canvas or a tube of paint is medieval thinking. Before you sell your subscription you should tell the fashion industry, fashion magazines, health & beauty brands, and a host of others that you reserve the right to censor their work. This is outrageous.
I'm a fashion photographer for ELLE, Vogue, Sports Illustrated Swimsuit, NYFW, Miami Swim Week, and Couture Week in NY & Paris. Your expand option does not work for fashion. Your program says that a woman in clothing does not meet community standards. This is high fashion, and Adobe has opted to become no longer a tool but a content editor. I think it's disgusting.
I work for Sports Illustrated Swimsuit, Elle, Harper's and many other fashion magazines. Your expand product is worthless to us because it does not like women.
I shoot for Vogue (Conde Nast), Harper's (Hearst), Sports Illustrated Swimsuit, WPP (ad agency), and the TV show Dancing with the Stars (BBC Studios). Everyone from the editors to the creative directors is very frustrated with Adobe. For the first time that I can remember, one firm is changing its entire WAN/SAN edit suites for NY, LA, Paris, and Spain because they feel the fashion industry has had it with Adobe. I get it, and if Adobe wants to censor what you can edit, they will just continue to hear about people walking away. A simple pair of shorts, a skirt, a bathing suit, a backless dress, an Olympic athlete: all examples of Photoshop destroying workflow. This could be Adobe's BUD LIGHT moment.
Tony Filson, the forum guidelines ask you to post once and then continue in the same thread. I've merged four of your duplicate posts into this thread for you.
How did you add canvas before the days of Generative AI?
Jane
»How did you add canvas before the days of Generative AI?«
A good question …
Anyway, the workaround of
Select > Subject,
Layer > New Fill Layer,
then expanding with the Crop Tool
seems to handle the second image just as well as the first one and could be recorded in an Action.
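For anyone who would rather script that prep step than click through it each time, here is a minimal Python sketch of the same idea using Pillow (not a Photoshop API): cover the subject with a flat color, expand the canvas, let Generative Expand fill the new area, then paste the untouched original pixels back. The file names, the subject bounding box, and the amount of added canvas are made-up placeholders; inside Photoshop you would get the subject from Select > Subject and record the equivalent steps as an Action.

```python
# Minimal sketch of the cover-then-expand workaround, using Pillow.
# ASSUMPTIONS: "portrait.jpg", SUBJECT_BOX and EXPAND_LEFT are hypothetical
# placeholders; swap in your own file and values.
from PIL import Image, ImageDraw

SRC = "portrait.jpg"                  # hypothetical source photo
SUBJECT_BOX = (400, 150, 900, 1200)   # hypothetical subject bounds (left, top, right, bottom)
EXPAND_LEFT = 600                     # pixels of canvas to add on the left
COVER = (128, 128, 128)               # stand-in for the temporary Solid Color layer

original = Image.open(SRC).convert("RGB")

# 1. Cover the person with a flat color so the fill model never sees them
#    (the Solid Color fill layer from the workaround above).
covered = original.copy()
ImageDraw.Draw(covered).rectangle(SUBJECT_BOX, fill=COVER)

# 2. Expand the canvas (the Crop Tool step) and place the covered image.
w, h = covered.size
expanded = Image.new("RGB", (w + EXPAND_LEFT, h), COVER)
expanded.paste(covered, (EXPAND_LEFT, 0))
expanded.save("for_generative_expand.png")

# 3. After running Generative Expand/Fill on that file in Photoshop,
#    paste the untouched original pixels back over the cover.
result = Image.open("for_generative_expand.png")  # the AI-filled result
result.paste(original, (EXPAND_LEFT, 0))
result.save("final.png")
```

Because the cover only exists in the copy sent to the generative service, the subject in the final file keeps exactly the pixels that came out of the camera.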
Yeah, that's what gets me every time... what did they do prior to May 2023, when this first appeared in Photoshop? Were they able to get any work done? 😄
I have cancelled my subscription to the Photo plan. Adobe violates my privacy and heavily censors generative fill. It's now the worst corporate censor and I won't be a party to that.
Nobody is violating your privacy. As for censorship, there is a lot of controversy right now about abusive uses of AI. I'm guessing that Adobe is erring on the side of caution. All it would take is a few accusations that they are, say, allowing users like you to generate, oh I don't know, child abusive material and the whole thing gets shut down.
I'm not going to engage in extended debate. Adobe's terms allow it to scan user materials. I have no reason to disbelieve someone like Tony Northrup, who has written multiple books about Adobe products and now decries the invasion of privacy. Then there is "scraping," which is not my central concern.
In the past year, after I renewed my Photo products subscription, I haven't used generative fill on a single picture with a child in it (correction: one baby), yet the censorship notification has appeared over and over, well over a hundred times. Sometimes the problem (for Adobe) is a term, and other times the photo. Often it is something like a woman in a swimsuit or a revealing top, but never a nude. Generative fill is very prone to censor editing photos of buxom women. Yesterday, it refused to allow me to fill "holster," and then I checked and "gun" is banned. I've never been able to use "bathing suit," and I ran into a prohibition on "bat" (the animal). Is it your contention that these terms, and dozens of others that I've encountered, are censored to protect children, even when there is no child in the photo? Do you believe Adobe's method can't distinguish between a buxom adult woman and a child?
That's all I plan to say in this forum. When Adobe is employing some method of scanning my photos and censoring my editing it is invading my privacy.
[abuse removed by moderator]
I'm not going to engage in extended debate.
By @nicmart
And proceeds to do exactly that.
I responded to a post earlier today demonstrating how easy it is to overcome the censorship issue.
https://community.adobe.com/t5/photoshop-beta-discussions/violation-notice/m-p/14877075#M17365
I love the tool to select an area of my picture and use generative AI to fix or enhance it, but I hate the inaccurate content restrictions. I am working on illustrating the Bible, which is a huge task even with AI to help. Often I find there are problems with hands, feet, and faces. It is great to be able to fix them with the Lasso tool and generative AI. The sad truth is that I am constantly being hit with guideline violations that seem to have no basis that I can see or understand. A generic guideline-violation message is not enough.
For example, here is a picture I am working on. It is a scene from the Bible.
There are a number of problems with the people in this scene that I'm trying to fix. For some reason, these two heads will not work with any prompt I can think of. I've tried long prompts like "a woman in a biblical scene with a serious expression on her face". I've tried short prompts like "Woman face". No matter what I try, I get the violation pop-up.
No matter what I try to do I get a guideline violation. There is no explanation of what guideline I am violating. I can't figure it out. I am constantly sending feedback on the images I work on, but nothing seems to improve. In fact, it seems to be getting worse. At times I feel so frustrated that I am actively looking for an alternative to Photoshop. Does anyone have a clue about how to deal with this roadblock?
Hi @lespardew,
Thank you for taking the time to share feedback on this guideline violation. We've passed it along to the team for review. We'll reach out if we have any questions. Please continue to share feedback in app when you see any issues.
^CM
Thanks for listening. It helps to know someone from Adobe is listening. There are a lot of frustrated artists out here.
This issue is not resolved. To mark it as resolved is to ignore the problem. Please push this to the top of importance. Adobe is losing users over this issue. I myself am looking for a solution other than Photoshop because of it. I think with the advent of tools like Flux Kontext we will see a mass exodus of artists from Photoshop for AI fill work.
Sorry you're still seeing this issue. We are working on it.
If possible, could you email me a file with a selection of the area you're trying to remove/fill, to imaderyc@adobe.com?
Thank you!
I've had it warn me about going "against community guidelines" when I was simply trying to remove a light stand from a wedding photo.
I get this warning ALL. THE. TIME, on the dumbest, most innocent stuff.
Fix a stray hair? "Goes against guidelines"
Remove a garden hose from the grass? "Goes against guidelines"
Remove a wrinkle in a shirt? "Goes against guidelines"
It's exhausting, and yes, I report each one and tack a photo onto it. I wouldn't mind their "guidelines" if they had a better algorithm to detect what I'm actually doing.
Dude - I am going to take a wild guess and say that Adobe has ZERO interest in fixing this. It has become infinitely restrictive now. I am currently working on a wedding photo with a close-up of the groom putting the wedding ring on his bride's hand, and I cannot get generative AI to do ANYTHING without a "guideline violation". I was focused on the hand, but then tried other areas of the photo and got the same problem. What? Why? These are custom, super high-res photos my son took for a wedding. He pays for Adobe and so do I (although we're both ready to quit [cursing removed]), and he asked me to help see if I could undo the blurring that occurred right at the hands, right where the actual action is.
Unfortunately, due to the lighting, he couldn't shoot at a high enough shutter speed to freeze that, and it is one of the MOST important photos.
All I am trying to do is drag a selection box around the ring and see if I can get generative AI to provide a little clean-up. But does Adobe build their AI to do ANY augmentation? Nope. That would just be a stupid idea for photo editing software... *sigh*... all it does is generate. Fine. Except now it won't even do that because I get the same guideline violation prompt. I've sent in my complaint with the image. But this is something like attempt number 56 of sending in a photo... JUST a photo... nothing promiscuous, nothing "oh noes! Art might have something suggestive??!!" (because why should Adobe work with artists to be artists... that would just be stupid). Every photo I've sent to Adobe to complain about and every post I make on this forum about it: zero effort on Adobe's part, because nothing changes. I go back after months to see if the new Photoshop updates fixed the previous problem and... nope. I don't think Adobe wants customers anymore. With so much new software out there and the $1500 my son and I are paying annually... unless Adobe wants to just lift the restrictions until they can figure out how to train their AI properly, this is becoming frustratingly useless. imho.