I have some photographs where I would LIKE to change the shirt or clothing of some of my family members. These are MY photographs that I have taken over the years. A couple of the photographs are of shirts that are somewhat revealing in nature, NOT nudity, kind of low cut in nature. If I select the shirt with a selection tool and try to change the shirt or top using Generative Fill, I get the dreaded Notice that my request violates the Guidelines. NOTE: I am NOT creating nudity, or removing nudity, simply trying to replace a Woman's shirt and a Male Shirt (Old Tank Top). I am simply trying to create NEW shirts. So, I do not understand what guideline I am violating. Again, these are OLD images of mine, where I am just trying to replace old shirts with NEW shirts.
Now, here is the WEIRD part. After I got the guideline notice, I tried to see if I could create an entirely NEW document with Generative Fill, and I typed in, "Girl wearing Bikini walking on the Beach with the Ocean in the Background", and it created it. So, I can create a Bikini girl, but I can not replace a semi low cut shirt on a woman, or a tank top on a male. Can somebody explain a way to get around this, or will this be fixed soon?
Bottom line we can complain all we want. Adobe doesn't care, they think they are god and just sit there and say [removed by moderator].
None of these complaints are going to do anything. Adobe is a corporation. In the United States of America, corporations have more rights than the actual human beings do. If Adobe wants to do something that we might call censorship, they can do it, and no one has any recourse, aside from ditching Adobe. I don't use AI much, so this hasn't really affected me (yet), but who knows where this ends up? In this country, where we're busy whitewashing and sanitizing our history and what used to be our very imperfect, but shared culture, Adobe is simply following the trend started by the powers that be.
Uhh... I wanted to see if anything had changed on this issue. I was working on some bikini and lingerie photos one day and it was a nightmare because I thought I'd get over the task quickly with Photoshop's new features and it turned out that I had to sit through overtime because of this messed up censorship... Of course, I canceled my subscription. Oh well, I'm back to Affinity and Flux AI.
Adobe AI is plain garbage nowadays. It was pretty good when first introduced, but now it is plagued by Adobe's prudish and brainless censorship of everything that shows a bit more nude skin. I don't understand if Adobe acknowledges how infuriating this is for everyone who uses its tools to process and enhance photos of people, and how badly it has affected the quality of the outputs. As a photographer, after several years of using Adobe products, I am now in the process of canceling my subscription at its annual expiration.
I think Adobe should stop being Big Brother and policing our creativity. As a professional fashion and boudoir photographer I frequently have images that include sheer or nude models. I also, being human, sometimes fail at framing the shot to perfection and need the Generative Expand tool to recompose my image, except if Adobe AI detects any kind of nudity it REFUSES TO EXPAND THE BACKGROUND. The nudity is NOT part of what is being generated, so Adobe is NEEDLESSLY and effectively censoring the content we are able to create. We are adults, and do not need babysitting or "protection" from the adult content we PERSONALLY create.
That's interesting that PS does that. I would agree that your situation should not be affected by Adobe's rules. This isn't what you were really asking about, but a workaround might be to create a duplicate document and remove the nudes, leaving just the background, then expand the background, and finally put the nudes back in.
Given the current firestorm around AI, this isn't surprising. Use a different tool.
I'm looking for one.
Creativity should be celebrated, not censored. Allowing full expression helps foster innovation and new ideas.
but a workaround might be to create a duplicate document and remove the nudes, leaving just the background, then expand the background, and finally putting the nudes back in.
Covering up the elements with a Solid Color Layer before expanding should suffice.
I do reconstructions of historical figures and when I try to use the generative AI it's censoring me so constantly it's practically useless. It won't let me use prompts as innocuous as "torso" or "Roman slave", it won't let me erase heads or fill in legs over the knee.
I tried to show a bare lower thigh on an Ötzi the Iceman reconstruction and it bawled me out over that.
Now I understand that they don't want to be involved in making p**n, but these are unsexy, fully clothed, uncontroversial historical illustrations they're stopping.
It's so overly censorious it's useless. I won't bother subscribing for another month. My old copy of CS5 still works fine.
Edit: It actually blocked me from using the word p**n legitimately in my post. I had to censor that too. Not even the Victorians were this ridiculous. Adobe hates art.
Hi @Philip33122994yy1c, it's not censoring based on your image; it's giving violations for the terms you are using. "Slave" is most likely a flagged term.
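To illustrate why term-based flagging produces the false positives described in this thread, here is a minimal sketch of a naive substring filter. This is NOT Adobe's actual implementation; the flagged terms and the function are purely hypothetical, and the point is only that context-blind matching blocks benign prompts while letting intent-neutral ones through.

```python
# Hypothetical illustration of naive keyword-based prompt filtering.
# The flagged terms below are assumptions for the example, not a real
# moderation list.

FLAGGED_TERMS = {"slave", "nude"}

def is_blocked(prompt: str) -> bool:
    """Return True if any flagged term appears anywhere in the prompt,
    with no regard for the surrounding context."""
    lowered = prompt.lower()
    return any(term in lowered for term in FLAGGED_TERMS)

# A historically innocuous prompt is rejected because it contains "slave":
print(is_blocked("Roman slave in a tunic, marble fresco style"))  # True
# while a prompt with no flagged word passes, regardless of intent:
print(is_blocked("girl wearing a bikini walking on the beach"))   # False
```

A context-aware classifier would have to weigh the whole prompt (and the surrounding image) rather than match isolated words, which is presumably why users see such inconsistent results.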
Flagging terms isn't good, though. Say the phrase "hit little girl with a car" is locked out in order to prevent it from drawing a picture of something like that for someone morbid. Flagging the phrase could prevent someone from getting help if the filter ignores the "what to do if someone" that comes before the flagged language.
Flagging terms isn't good though. Say the term "hit little girl with a car" is locked out in order To prevent it from drawing a picture of something like that for someone morbid. Flagging the term could prevent someone from helping if it Ignores "what to do if someone" prior to the flagged language.
By @Josh35231743eutl
Under which circumstances would generative AI (that is intended to create images, not provide medical advice) need to process »what to do if someone hit little girl with a car«?
@Philip33122994yy1c , you are free to continue using the perpetual license on your current set-up as long as your OS, hardware, drivers etc. can support it. (edited)
And if Adobe’s Firefly/Generative Fill do not live up to your expectations, you can post/support Feature Requests on the issue, and/or use some of the other available services, or do your image editing the same way people did before Generative AI was available.
But please don’t post nonsense like
Adobe hates art.
on this Forum.
Issue: Generative AI feature blocking attempts to sanitize/remove/censor explicit content from images
Photoshop version: Adobe Photoshop 25.4.0
OS: Windows 11 64-bit, Version: 10.0.22000.2538
Steps to reproduce:
1. Open Adobe Photoshop.
2. Load an image containing explicit content that needs to be sanitized or censored (i.e. covering explicit areas such as nudity with clothes/drapings/leaves).
3. Select the area to be sanitized with the Lasso tool.
4. Select "Generative AI" option.
5. Attempt to use the generative AI feature to sanitize or remove explicit content from the image.
Expected result: The generative AI feature should process the image and successfully sanitize or remove explicit content, making the image suitable for appropriate viewing.
Actual result: The generative AI feature blocks the attempt to sanitize or remove explicit content from the image, indicating that such attempts are forbidden by the rules. (False: the rules state that I should not attempt to CREATE, UPLOAD, or SHARE abusive or illegal content, or content that violates the rights of others. I'm attempting to REMOVE such content, NOT create it; it's already there and I want it GONE.)
Recommendation: Review the blocking mechanism of the generative AI feature to differentiate between attempts to create explicit content and attempts to sanitize or remove explicit content, so it aligns with user intentions (i.e. via specific keywords such as sanitize/remove/censor). Drawing a parallel with the content-censorship approach employed by platforms like 4kids TV, consider allowing users to utilize the generative AI feature for sanitization purposes without being blocked.
[link removed by moderator]
I'm a professional photographer specializing in boudoir, maternity, and glamour shoots, and I rely heavily on your software to retouch my clients' portraits. However, your content filters on the generative AI edit feature are making it impossible to do this quickly and efficiently. 90% of my generative fill requests are getting blocked, significantly prolonging my editing process. This is extremely frustrating.
I can't fix skin, hair, or anything else that touches skin. I can't do simple outfit adjustments, and even when I'm trying to fix backgrounds, if they're skin-colored, they get flagged. This is absolutely unacceptable. I'm not shooting pornography, making requests to create nudity, or doing anything inappropriate. I'm simply trying to retouch my images.
I'm growing increasingly disappointed in your software. I need quick and precise edits for my work, and these constant errors are severely impacting my productivity.
As a company that develops software specifically for photographers like myself, you should understand how critical this functionality is for us. Please address and resolve this issue as soon as possible. This can't keep happening. DO SOMETHING. I'm literally begging you.
Signed,
A photographer who pays you a ton of money every year to be able to count on your software to work for my business.
Same, I am starting to have issues now. Nothing in this feature will work, and it's simple things like removing or adding trees. I updated the app and it still didn't fix it.
@Kra_Ash , please post meaningful examples/screenshots to illustrate your specific issues.
90% of my generative fill requests are getting blocked, significantly prolonging my editing process.
By @Retouching Angels
Since 90% of your GF is blocked, you might want to review the guidelines to avoid future issues and see what is not permitted.
https://www.adobe.com/legal/licenses-terms/adobe-gen-ai-user-guidelines.html
In addition, the best way to give feedback to Adobe is to click "Provide Feedback" in the alert.
when I'm trying to fix backgrounds, if they're skin-colored, they get flagged.
By @Retouching Angels
What color do you mean when you say "skin-colored"?
Jane
Forum volunteer
Any of those, really. If a photo seems to show too much skin (bathing suit), or if you are using generative fill to edit clothing on a person and your selection comes near the skin, it will reject it. Trying to use generative fill to match uneven skin color, whited-out areas, shadows, etc. on the upper chest or legs will prompt the infamous orange warning to read the rules, which we have all done and are not violating.
Yes. the full spectrum we mean. I have even had issues with wedding photos. Apparently adobe thinks they are showing too much leg. Remember the thumb rule. If your skirt is shorter than that. UNEXCEPTABLE. Please stop treating us like children. At least those of us ADULTS not on an EDU license. Or soon we will all be in painting with Comfy ai and Flux Lora etc. Adobe. NOW IS NOT THE TIME TO DROP THE BALL. This AI market will own you hard!
It is literally ridiculous. I had someone ask me to remove a tattoo from a foot photo for their nail salon. The pic is literally two feet. Generative Fill blocked it. Every time it blocks, I send a report. I'm starting to spam them with every single block, telling them how ridiculous this is.
Yes. the full spectrum we mean. I have even had issues with wedding photos. Apparently adobe thinks they are showing too much leg. Remember the thumb rule. If your skirt is shorter than that. UNEXCEPTABLE. Please stop treating us like children. At least those of us ADULTS not on an EDU license. Or soon we will all be in painting with Comfy ai and Flux Lora etc. Adobe. NOW IS NOT THE TIME TO DROP THE BALL. This AI market will own you hard!
By @Craig33807953z51d
First off: Did you mean »Unacceptable«?
Secondly: You are not talking to Adobe; Feature Requests (Idea) and Bug Reports are registered at Adobe, otherwise this is essentially a user Forum.