For artistic work, I'm directing a film on the subject of domestic violence, and I'm exploring the possibility of AI-generated images depicting portraits of domestic violence against children. I understand this is a sensitive subject. But instead of using real images, which are far more gruesome than AI images, I would like to use Firefly to generate these kinds of images. Whenever I include the word "scar" or "wound" in the prompt, it fails to generate accordingly. This happens only with women and children: when I replace "girl" or "woman" with "man", it suddenly becomes possible to generate a scar on the image according to the description given. I understand that depicting pain or signs of violence on a child's or woman's face is sensitive, and I completely understand not showing signs of violence on photorealistic generated images. But for a piece of art, a drawing, or similar stylized images, I don't understand the problem, and I personally find it discriminatory towards men: it's not okay to depict scars, wounds, or blood on generated images of children and women, yet it is okay to use these elements on images of men.
I'm exploring the possibility of AI-generated images depicting portraits of domestic violence against children.
By @Alexander34430343a55m
See the User Guidelines here, especially #2:
https://www.adobe.com/legal/licenses-terms/adobe-gen-ai-user-guidelines.html
and note this sentence:
"Please note that we may report any material exploiting minors to the National Center of Missing & Exploited Children (NCMEC)."
Firefly is going to block most or all of the images you are attempting to generate. That is intentional.
Jane
Forum volunteer