Hello Adobe and its collective users
I am writing to you not only as a devoted user of Adobe's suite of creative tools but also as a professional photographer whose work has been recognized and displayed in museum settings. My specialization in classic nudes has allowed me to explore the human form in a manner that celebrates beauty, form, and artistic expression. However, I have encountered a significant challenge with the AI restrictions placed on editing images that contain nudity, even when such images are created within a professional, artistic context.
As an artist whose work often involves nuanced and sensitive subjects, I understand and respect the complexities of creating ethical AI tools that serve a wide user base. However, the current limitations significantly impact my creative process and professional workflow, particularly when it comes to editing backgrounds for nude or semi-nude images. These restrictions not only prolong my work but also inhibit my artistic expression, compelling me to seek alternative solutions that may not offer the same level of quality and integration as Adobe's products.
I propose the consideration of the following points, which I believe could benefit both Adobe and its professional users:
Artistic Integrity and Professional Use: Recognition of the professional and artistic context in which tools are used can help differentiate between content that is genuinely creative and that which the restrictions aim to prevent.
Ethical Use Policy: An ethical use policy that accommodates professional artists and photographers, possibly through a verification process, ensuring that our work is not unduly censored while maintaining legal and ethical standards.
Custom Solutions for Professionals: The development of specialized software versions that allow more flexibility for editing sensitive content, with appropriate safeguards to prevent misuse.
Feedback and Advisory Panel: Establishing a panel of professionals from the art and photography community to provide ongoing feedback and insights on how Adobe's tools can better serve creative professionals.
Transparent Guidelines: The creation of clear, transparent guidelines that navigate the legal and ethical landscape, especially regarding sensitive content, to ensure users can understand and comply with Adobe's policies.
I am fully committed to engaging in a constructive dialogue and am willing to be part of a solution that respects both the creative needs of artists and the ethical considerations of digital content. I believe that by working together, we can find a balanced approach that supports artistic expression while adhering to shared values and responsibilities.
Thank you for considering my perspective on this matter. I am hopeful for an opportunity to discuss this further and explore how we can make Adobe's tools even more inclusive and accommodating for professional artists and photographers.
Steven Williams
Can you use Remove Background, and set her against a rebuilt background? We used to manage this stuff just fine before we had Gen Fill.
I'm curious if Adobe actually screens their images, which would make having a "safe search" button purposeful. You might as well remove it, because I can tell you, it ain't working. If you need our help, even though we have work to do and aren't getting paid to make Adobe appear professional, it would be nice to have a button on each image enabling us to report inappropriate imagery to you, so that you can then flag these images. I'm sure I am not the first person to say something, so please, since we are paying for your service, please listen to us and do something about it. I know some women don't mind exploiting themselves, but I'd rather not see it since, being a woman, I think women are worth so much more than their bodies.
Thank you...
Hi @Cyndi25639385sikt,
I appreciate your concerns; you are not alone, and our Search team is continually improving and updating what is included or excluded in safe search.
If you can provide details, either here or in a private message, I can forward this to the appropriate team for review.
I looked up "beauty black background" to find a beauty shot for a salon, specifically a salon hairstyle with a dark background. I see several images of women in sexual poses wearing strings and one photo of completely exposed breasts. If that's specific enough for Adobe. All you have to do is search for beauty images and there you go. It's been going on for years and other people have made complaints. I've even seen a couple in a sexual position with hardly a scrap of "clothing" before. I also did a chat with a rep and got the usual: sorry, we'll look into it.
I know that Adobe may have limited employees at this time, but if there is a way to prevent these images from even being uploaded in the first place, that would be easier for you to deal with. I understand that I do not know the process by which Adobe gets images from stock photographers, but IMO, they shouldn't be able to upload anything without it being approved first. If this is not doable, then there needs to be a way for us to flag images so they are immediately removed from safe search.
I've forwarded your complaint to the appropriate team for review.
I understand that I do not know the process by which Adobe gets images from stock photographers, but IMO, they shouldn't be able to upload anything without it being approved first. If this is not doable, then there needs to be a way for us to flag images so they are immediately removed from safe search.
By @Cyndi25639385sikt
I do not know how it worked before Adobe took over Fotolia, but Adobe checks images before they are open for sale in the database. I suppose that this is also the moment when the moderator needs to set the "safe search" flag. (As a side note: there is also a check that the model is not underage, and the photographer needs to submit a model release.)
https://helpx.adobe.com/stock/contributor/help/the-review-process.html
Okay, if each and every photo is being seen and approved for sale before release, then whoever is responsible for marking them as "safe search" is not doing their job. So that narrows it down and makes the solution simple. Whoever is responsible for screening the images, when they see "sexual" or "no clothing", that's a clue to remove it from the safe search list.
FYI, I just did another test with the word "murder" and the word "pornography" in the so-called safe search. Yeah, those aren't being filtered, either. There's really no excuse if an Adobe human employee truly screens each and every photo. You can't miss the images I've seen, if you are looking at them before you approve them.
You can also add "r.a.p.e." to the list. The bar just keeps getting lower for what people think is workplace-professional or decent for children to see. I understand most designers won't be looking up these words, but a child might. And if you have junk like this pop up in benign searches, it makes it more of a problem. Hilarious, I just had to retype this post because the word "r.a.p.e." is not permitted in this community, but it's "okay" to have graphic images of this topic and sexist and pornographic imagery in your database. Incredible.
I noticed that Photoshop, just like many AI tools, does not support nudity. That's a huge mistake. Photoshop is used in art involving nudity all the time. I am sure the root of this has to do with the AI community, but I would advise Adobe to reconsider. There are many tools now which ignore the AI community guidelines on nudity.
To be clear: it is a huge mistake to govern nudity, and there are many tools which now support it. Why are we allowing AI to govern our artistic instincts? Come on now. Don't restrain artists at the very basic level of using Photoshop.
@sjedens This has been discussed multiple times in the forum. My question to you is: if it were allowed, what would prevent someone from using this to create nude images of another person? As you stated, this is the route many AIs have taken to prevent misuse.
Adobe is widely used in educational and business settings. They've made a choice to prevent misuse/abuse and train on licensed models to prevent liability.
If you are working with nudity, there are ways to work around the existing models in Photoshop.
If you are trying to generate nudity, you're better off looking at other solutions.
Your analogy is flawed. It's not the screwdriver that's responsible in this case. If a car manufacturer adds a feature that can intentionally cause harm, both the driver and the car manufacturer would be at fault.
@sjedens gun manufacturers can be sued if their product has a flaw or feature that causes harm to the user. That's different than if the weapon is used in a crime. I'm not going to debate semantics with you all day; the reality is that Adobe has chosen this path for their AI. There are other options out there if your livelihood depends on nudity.
I am trying to use generative AI on pictures containing nudity (covered nudity, not erotic in nature), only to expand the background. Why can't I do this?
As I understand it, Adobe's rule is as follows:
Do not use Adobe's generative AI features to attempt to create, upload, or share abusive, illegal, or content that violates the rights of others. This includes, but is not limited to, the following:
And I am trying to do none of those things. Why is Adobe assuming I am trying to create or upload abusive material just because it contains naked skin (not even exposed nudity)? I am not even trying to alter the person in the photo in any way, only to expand the damn background. The photo was taken during a photoshoot on mutually agreed terms.
So effectively we cannot use Generative Fill as long as Adobe detects any kind of exposed skin in the photo?
AI is under intense scrutiny and getting bad press from the media and governments, so Adobe is being careful not to be seen promoting the misuse of AI-generated images.
I understand where Adobe stands on this issue, but hopefully they will loosen this up at some point.
So effectively we cannot use Generative Fill as long as Adobe detects any kind of exposed skin in the photo?
If you just want to extend the background, make a duplicate, paint over the part that's probably triggering the issue, and expand away.
Then duplicate the result to the original image and mask it.
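For anyone who wants to script that workaround instead of doing it by hand, here is a minimal Python/Pillow sketch of the same idea. It is not an official Adobe workflow; the file names, the rectangle coordinates, and the paste offset are placeholders you would need to adapt, and the Generative Expand step itself still happens manually in Photoshop between the two steps.

# Rough sketch of the "paint over the trigger area, expand, then mask it back" idea.
# Assumes Pillow is installed; file names and the rectangle below are placeholders.
from PIL import Image, ImageDraw

TRIGGER_BOX = (400, 300, 900, 1200)  # left, top, right, bottom of the sensitive area

# Step 1: cover the sensitive region with neutral grey before running Generative Expand.
original = Image.open("photo.jpg").convert("RGB")
covered = original.copy()
ImageDraw.Draw(covered).rectangle(TRIGGER_BOX, fill=(128, 128, 128))
covered.save("photo_covered.jpg")
# ...run Generative Expand / background extension on photo_covered.jpg in Photoshop...

# Step 2: paste the untouched original back over the expanded result.
expanded = Image.open("photo_expanded.jpg").convert("RGB")
# The original sits at some offset inside the expanded canvas; (0, 0) is a placeholder offset.
expanded.paste(original, (0, 0))
expanded.save("photo_final.jpg")

The point of the grey patch is only to keep the trigger area out of the frame the generative model sees; the untouched original is composited back afterwards, so the final image is not altered by the workaround.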
I generate AI images in the fashion segment. Unfortunately, I suspect that on the Internet, and specifically on Adobe, there may be different ideas about what nudity is permitted in advertising photography. So what are the rules regarding nudity? I have the following questions.
Can you be specific about the categories?
1. Female nipples
2. Male nipples
3. A girl in a bikini or underwear (regular design, nothing visible, just a sporty style)
4. A girl in slightly see-through underwear (skin visible but no nipples; lace is always somewhat see-through; I repeat that we are talking about an advertising photo for clothes)
5. A girl in see-through underwear where you can see the rough outline of the nipples, but nothing below the waist, of course. For example, a bra, or a crop top without a bra where the nipple shape shows.
6. A girl with a completely bare breast, but with some kind of decorative detail covering the nipple completely. So the entire shape of the breast is visible, but there is a decorative heart or a mini bikini over the nipple.
7. The naked back of a woman (from behind, but it is clear that she is not wearing a bra; the bottom is normal clothing; for example, imagine a photo of a back massage).
8. There are different policies in photography about how the bottom of panties should look. In advertising from the 2000s there was, let's say, a straight line (which in reality hardly exists with any type of clothing). In advertising from the 2020s there is already a more realistic picture with a visible shape. What does Adobe think about this?
Each question concerns advertising for swimsuits, underwear, and summer clothes in different forms (crop tops, bikini tops, mini-bras). In general, there are two categories: romantic lingerie for women in their 30s and 40s, and bright, rich summer colors for girls in their 20s and 30s.
Thank you in advance!
You are not addressing Adobe directly here in this forum and will not receive a response to your long list of questions. Have you read the Adobe Stock guidelines on nudity? Here's the text:
"Nudity
For submissions containing nudity that has artistic value, the model must be at least 18 years old and the model release must include the model's photo ID so we can verify age. Never submit any sexually explicit, pornographic, or immoral material, including material that sexualizes minors."
I would summarize "nudity that has artistic value" as follows: no genitalia, no nipples, and no sexually suggestive or exploitative poses.
Mmm, I'm new here; I didn't know that. I thought the long-standing accounts here worked at Adobe. At least that's how they communicate and try to come across sometimes. And yes, thank you very much for the answer.
The forum members that have the "Community Expert" badge are Contributors, just like you; however, we also have significant experience in using the forum and with Adobe Stock T&C's, guidelines and procedures. We are NOT Adobe employees and have no insider knowledge beyond the policies and guidelines that have been publicly issued by Adobe.