
Nudity and semi-nudity using AI and its imposed restrictions

Community Beginner, Feb 26, 2024

Hello Adobe and its collective users

I am writing to you not only as a devoted user of Adobe’s suite of creative tools but also as a professional photographer whose work has been recognized and displayed in museum settings. My specialization in classic nudes has allowed me to explore the human form in a manner that celebrates beauty, form, and artistic expression. However, I have encountered a significant challenge with the AI restrictions placed on editing images that contain nudity, even when such images are created within a professional, artistic context.

 

As an artist whose work often involves nuanced and sensitive subjects, I understand and respect the complexities of creating ethical AI tools that serve a wide user base. However, the current limitations significantly impact my creative process and professional workflow, particularly when it comes to editing backgrounds for nude or semi-nude images. These restrictions not only prolong my work but also inhibit my artistic expression, compelling me to seek alternative solutions that may not offer the same level of quality and integration as Adobe’s products.

 

I propose the consideration of the following points, which I believe could benefit both Adobe and its professional users:

 

Artistic Integrity and Professional Use: Recognition of the professional and artistic context in which tools are used can help differentiate between content that is genuinely creative and that which the restrictions aim to prevent.

 

Ethical Use Policy: An ethical use policy that accommodates professional artists and photographers, possibly through a verification process, ensuring that our work is not unduly censored while maintaining legal and ethical standards.

 

Custom Solutions for Professionals: The development of specialized software versions that allow more flexibility for editing sensitive content, with appropriate safeguards to prevent misuse.

 

Feedback and Advisory Panel: Establishing a panel of professionals from the art and photography community to provide ongoing feedback and insights on how Adobe’s tools can better serve creative professionals.

 

Transparent Guidelines: The creation of clear, transparent guidelines that navigate the legal and ethical landscape, especially regarding sensitive content, to ensure users can understand and comply with Adobe’s policies.

 

I am fully committed to engaging in a constructive dialogue and am willing to be part of a solution that respects both the creative needs of artists and the ethical considerations of digital content. I believe that by working together, we can find a balanced approach that supports artistic expression while adhering to shared values and responsibilities.

 

Thank you for considering my perspective on this matter. I am hopeful for an opportunity to discuss this further and explore how we can make Adobe’s tools even more inclusive and accommodating for professional artists and photographers.

Steven Williams

Idea · No status · 54.3K views


2 Correct answers

Community Expert, Jun 12, 2024

@Dalvidos Similar requests have been made and each time users are referred back to the terms of use outlined by Adobe.

https://www.adobe.com/legal/licenses-terms/adobe-gen-ai-user-guidelines.html

 

Community Expert, Jun 04, 2024

Adobe is widely used in educational and business settings. They've made a choice to prevent misuse and abuse, and to train on licensed content to limit liability.

If you are working with nudity, there are ways around the existing models in Photoshop:

  1. Duplicate the layer. Hide the original Layer.
  2. Paint over the "offensive" areas covering up any triggered items. 
  3. Select and generate.
  4. Turn off the painted layer once you have your generation.
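For batch work, the first and last steps of this workaround can also be scripted. Below is a minimal ExtendScript sketch (run inside Photoshop via File > Scripts; it will not run standalone, Generative Fill itself is not exposed to scripting, and the layer name is just an assumption):

```javascript
// Sketch only: automates steps 1 and 4 of the workaround above.
// Assumes a document is open in Photoshop; painting over the
// triggering areas and running Generative Fill stay manual.
var doc = app.activeDocument;

// Step 1: duplicate the active layer and hide the original.
var original = doc.activeLayer;
var paintOver = original.duplicate();
paintOver.name = "paint-over";   // hypothetical name, pick your own
original.visible = false;
doc.activeLayer = paintOver;

// ...steps 2-3 by hand: paint over, select, and generate...

// Step 4: once the generated layer exists, hide the painted copy:
// paintOver.visible = false;
```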

If you are trying to generate nudity - you're better off looking

...

219 Replies
Community Beginner, Aug 25, 2022

I'm curious if Adobe actually screens their images, which would make having a "safe search" button purposeful. You might as well remove it, because I can tell you, it ain't working. If you need our help, even though we have work to do and aren't getting paid to make Adobe appear professional, it would be nice to have a button on each image enabling us to report inappropriate imagery to you, so that you can then flag these images. I'm sure I am not the first person to say something, so please, since we are paying for your service, please listen to us and do something about it. I know some women don't mind exploiting themselves, but I'd rather not see it since, being a woman, I think women are worth so much more than their bodies.

Thank you...

Adobe Employee, Aug 26, 2022

Hi @Cyndi25639385sikt,
I appreciate your concerns. You are not alone, and our Search team is continually improving and updating what is included or excluded in safe search.

 

If you can provide details, either here or in a private message, I can forward this to the appropriate team for review.

Community Beginner, Aug 26, 2022

I looked up "beauty black background" to find a beauty shot for a salon, specifically a salon hairstyle with a dark background. I see several images of women in sexual poses wearing strings and one photo of completely exposed breasts. If that's specific enough for Adobe. All you have to do is a search for beauty images and there you go. It's been going on for years and other people have made complaints. I've even seen a couple in a sexual position with hardly a scrap of "clothing" before. I also did a chat with a rep and got the usual: sorry, we'll look into it. I know that Adobe may have limited employees at this time, but if there is a way to prevent these images from even being uploaded in the first place, that would be easier for you to deal with. I understand that I do not know the process by which Adobe gets images from stock photographers, but IMO, they shouldn't be able to upload anything without it being approved first. If this is not doable, then there needs to be a way for us to flag images so they are immediately removed from safe search.

Adobe Employee, Aug 26, 2022

I've forwarded your complaint to the appropriate team for review.

Community Expert, Aug 26, 2022

quote

I understand that I do not know the process by which Adobe gets images from stock photographers, but IMO, they shouldn't be able to upload anything without it being approved first. If this is not doable, then there needs to be a way for us to flag images so they are immediately removed from safe search. 


By @Cyndi25639385sikt

I do not know how it worked before Adobe took over Fotolia, but Adobe is checking images before they are offered for sale in the database. I suppose that this is also the moment when the moderator needs to set the "safe search" flag. (As a side note: there is also a check that the model is not underage, and the photographer needs to submit a model release.)

https://helpx.adobe.com/stock/contributor/help/the-review-process.html

ABAMBO | Hard- and Software Engineer | Photographer

Community Beginner, Aug 27, 2022

Okay, if each and every photo is being seen and approved for sale before release, then whoever is responsible for marking them as "safe search" is not doing their job. So that narrows it down and makes the solution simple. Whoever is responsible for screening the images, when they see "sexual" or "no clothing", that's a clue to remove it from the safe search list. 

FYI, I just did another test with the word "murder" and the word "pornography" in the so-called safe search. Yeah, those aren't being filtered, either. There's really no excuse if an Adobe human employee truly screens each and every photo. You can't miss the images I've seen, if you are looking at them before you approve them.

Community Beginner, Aug 27, 2022

You can also add "r.a.p.e." to the list. The bar just keeps getting lower for what people think is workplace-professional or decent for children to see. I understand most designers won't be looking up these words, but a child might. And if you have junk like this pop up in benign searches, it makes it more of a problem. Hilarious, I just had to retype this post because the word "r.a.p.e." is not permitted in this community, but it's "okay" to have graphic images of this topic and sexist and pornographic imagery in your database. Incredible.

Explorer, Jul 10, 2023

I have tried the beta version of Firefly and I find it very interesting, but before committing to the purchase of the official version with a Photoshop license, there is one issue that should be clarified. As an illustrator, I need a tool that produces exactly what I want, without passing judgment on it. With the beta version I found myself faced with a serious problem. Not only does the AI not accept prompts like "naked woman on piano in black and white", which, in terms of art and photography, are absolutely valid, but also things like "chicken breast on the grill", which I had to generate to accompany a cooking recipe. That is really ridiculous. Is there any possibility in the paid version to remove these absurd limits?

New Here, Jul 10, 2023

I also found some restrictions absurd. I wanted to introduce dinosaurs into some photos, and if you enter their scientific names they are all restricted. Why?? Any dinosaur you can get is obviously an invention of the AI anyway. This is not good.

 

New Here, Nov 11, 2023

So I have this problem: I did a pregnancy photoshoot for my client and I would like to extend the white background. The problem is that I cannot use generative fill because the woman is naked. All the intimate areas are covered, but I still cannot use generative fill. Is there any way to get around this restriction, or maybe there will be an update that separates explicit nude photography from actual pregnancy shoots?

Community Expert, Nov 11, 2023

If it's a simple background, one thought is the Crop Tool using the Content-Aware Fill option. Just input the Aspect Ratio, and if you have a batch of images, make it a Crop Preset.

New Here, Nov 11, 2023

That's one option, but the problem is that my white background is not smooth enough and there are some spots where the light was just "ugly" 😅. So I use it to create a "new" white background as well, and of course it's way faster than doing Content-Aware Fill. And when I used that option it failed anyway, giving me the message that I don't have enough RAM to use this feature, which is just ridiculous with 16 GB.

Community Expert, Nov 11, 2023

Off the top of my head...

  1. Work on a copy.
  2. Select the subject and expand the selection slightly.
  3. Content-Aware Fill to remove the subject. Now there is no person.
  4. Extend the image using Generative Fill or whatever AI features you like.
  5. Drop the original image back into the extended image, masking as needed.
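For repeat jobs, the remove-the-subject step can be sketched in ExtendScript as well (Photoshop only, not standalone; selecting the subject and the Generative Fill extension stay manual steps, and the Content-Aware fill action descriptor below is the commonly used form, included here as an assumption):

```javascript
// Sketch only: with a rough selection around the subject already
// made, expand it slightly and remove the person with a
// Content-Aware fill; extend the image afterwards by hand.
var doc = app.activeDocument;

// Expand the selection ~10 px so no subject edge pixels remain.
doc.selection.expand(new UnitValue(10, "px"));

// Edit > Fill using Content-Aware, without showing the dialog.
var desc = new ActionDescriptor();
desc.putEnumerated(charIDToTypeID("Usng"), charIDToTypeID("FlCn"),
                   stringIDToTypeID("contentAware"));
executeAction(charIDToTypeID("Fl  "), desc, DialogModes.NO);

doc.selection.deselect();
// Now there is no person; extend with Generative Fill, then drop
// the original image back on top and mask as needed.
```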

 

Community Expert, Nov 11, 2023

 

quote

the problem is that I cannot use generative fill because the women is naked.

By @ms58149619

 

 

 

See guideline number 2 for generative AI usage and find a way that does not break the guidelines.

2. Be Respectful and Safe

Do not use Adobe’s generative AI features to attempt to create, upload, or share abusive, illegal, or confidential content. This includes, but is not limited to, the following:

  • Pornographic material or explicit nudity

https://www.adobe.com/legal/licenses-terms/adobe-gen-ai-user-guidelines.html

 

We cannot offer advice to break the guidelines on the Adobe forums.

 

Jane

Community Expert, Nov 11, 2023

Can you use Remove Background and set her against a rebuilt background? We used to manage this stuff just fine before we had Gen Fill.

Explorer, Dec 30, 2023

QINGCHARLES_0-1704003665984.png

I work on photos of professional models and often use AI features like generative fill to try and remove things left in the backgrounds of shoots, or wall electrical outlets etc that are ugly and distracting.

 

The problem is that Adobe products are made by very prudish and conservative American developers for whom a bikini or a n*pple slip is a scandal. (Adobe won't let me write the scientific term for the milk gland of a mammal.)

 

Sometimes I can get around it by cropping the image to take the model out, fix the issue I need and then paste the area back in, but obviously this doubles the time it takes me to edit an image. And often, it just won't do it. That box today was caused simply by having the face of a Playb*y model in the image (nothing below her shoulders).

QINGCHARLES_0-1704004170355.png

 

Can you get a discount from your subscription because the features don't operate?

 

Has anyone tried to generative fill in Luminar Neo? Does it have any of the same issues? I need a replacement product.

 

p.s. Generative Fill is fine with murder scenes, just not bare shoulders on women. Bare shoulders on men are fine. T*pless men are fine (but the word itself is banned by Adobe). Why is this product so sexist? Is there a tort suit available if a company makes a product that so blatantly violates the rights of one particular subset of the population?

QINGCHARLES_0-1704004132634.png

 

New Here, Jan 25, 2024

Agreed. I'm using the full version now, but I still cannot generate a cowboy holding a smoking gun. "Smoking" and "gun" are screened. Colt 45, screened. I'm looking for other AI because Adobe Firefly is censor-happy. I understand that Adobe doesn't want to be held liable for assisting in creating inflammatory imagery. However, my client needs something specific, and if Adobe cannot deliver, why pay such a high price? It's like buying a power drill that yells at you when you go to use it.

New Here, Feb 14, 2024

We as the creators should decide what we create, and there shouldn't be filters telling us what is and is not appropriate. Vote to require that a setting be activated for the filter to be on; we should choose whether to have it on or not.

New Here, Feb 15, 2024

I am trying to use generative AI on pictures containing nudity (covered nudity), not erotic in nature, only to expand the background. Why can't I do this?

 

As I understand it, Adobe's rules are as follows:

Do not use Adobe’s generative AI features to attempt to create, upload, or share abusive, or illegal, or content that violates the rights of others. This includes, but is not limited to, the following:

  • Pornographic material or explicit nudity

And I am trying to do none of those things. Why is Adobe assuming I am trying to create or upload abusive material just because it contains naked skin (not even exposed nudity)? I am not even trying to alter the person in the photo in any way, only to expand the damn background. The photo was taken during a photoshoot on mutually agreed terms.

So effectively we cannot use generative fill as long as Adobe detects any kind of exposed skin in the photo?

Community Expert, Feb 15, 2024

AI is under intense scrutiny and getting bad press from media and governments, so Adobe is being careful not to be seen promoting the misuse of AI-generated images.

Community Beginner, Feb 15, 2024

I understand where Adobe stands on this issue, but hopefully they will loosen this up at some point.

RV

Community Expert, Feb 16, 2024

quote

So effectively we cannot use generative fill as long as adobe detects any kind of exposed skin present on the photo?

If you just want to extend the background, make a duplicate, paint over the part that's probably triggering the issue, and expand away.

Then duplicate the result back onto the original image and mask it.

New Here, Feb 23, 2024

As a dedicated wedding and boudoir photographer, I greatly appreciate the innovative features that Photoshop offers, particularly the AI functionality. It has been an invaluable tool in my workflow, allowing me to efficiently remove unwanted items from the background of my images, thus enhancing the overall aesthetic and ensuring a seamless final product for my clients.

 

However, I keep encountering a significant limitation with the AI feature that has hindered its usability in my line of work and I am really frustrated. It has come to my attention that the current guidelines restrict the use of AI when processing images containing women's bodies in lingerie. According to these guidelines, AI will not generate any edits in such cases, effectively rendering the feature unusable for a significant portion of my photographic work.

 

While I understand the importance of responsible usage guidelines, I believe that the current restrictions are unnecessarily limiting and fail to account for the diverse needs of photographers working in various genres and styles. Boudoir photography, like any other form of art, deserves to benefit from the technological advancements that software such as Photoshop offers, without arbitrary constraints that inhibit its creative potential.

 

What are you guys going to do to fix this?

Community Expert, Feb 25, 2024

Feature Requests (Ideas) and Bug Reports are registered with Adobe; otherwise this is essentially a user forum, though thankfully some Adobe employees do occasionally chime in.

So you are not talking to Adobe, but to other Photoshop users.

 

Feel free to post a Feature Request, but please do a search first, and if one that reflects your wishes already exists, add your support to that.

Community Expert, Feb 25, 2024

As a workaround you can try blacking out the »offending« content on an additional layer, work atop that, and remove it when the background edits are done.
