Race issue for AI Image

Community Beginner, Oct 18, 2023


Hey, I've been testing Firefly for quite some time. However, I've observed that whenever I mention 'woman', 'men', 'male', or 'female', Adobe Firefly consistently returns images of Indians. I'm not sure about the reason, as I've never specified a location or any other detail in the prompt. I am from India.

Bug Unresolved
TOPICS: Imaging
Views: 1.7K

1 Correct answer

Adobe Employee, Oct 24, 2023

Thank you for reaching out. We use the location indicated in your profile to show people with similar ethnicities to your region. If you have a specific ethnicity you want to see in your images, please indicate it in your prompt. This will override the region default.

15 Comments

Community Expert, Oct 18, 2023


Hi,

As you stated yourself, you are from India, and if your browser locates you in that region, the AI tries to stay in that region as well. For me, when I do not specify other regions, I get more European people. If you want other regions of the world, you can specify that.

Regards,
Henrik

Community Expert, Oct 18, 2023


Interesting one this, @Ashutosh32207281my2e.

The engine is clearly geo-tagging.

Perhaps add the types of people you require?

You can also add negative prompts now.

Let us know how you go

Best

mj
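
To make that suggestion concrete, here is a minimal sketch of how a prompt with explicit descriptors and a separate list of exclusions could be assembled before being sent to a text-to-image service. The field names (`prompt`, `negativePrompt`) and the helper function are illustrative assumptions, not confirmed Firefly parameters.

```python
# Illustrative sketch only: pairing explicit descriptors with exclusions.
# "negativePrompt" is an assumed field name, not a confirmed Firefly parameter.

def build_request(subject: str, descriptors: list[str], exclude: list[str]) -> dict:
    """Combine a subject with explicit descriptors and a list of terms to avoid."""
    prompt = ", ".join([subject] + descriptors)
    return {
        "prompt": prompt,
        "negativePrompt": ", ".join(exclude),
    }

if __name__ == "__main__":
    request = build_request(
        subject="a woman reading in a cafe",
        descriptors=["East Asian", "natural lighting"],
        exclude=["traditional costume", "stereotypical clothing"],
    )
    print(request)
    # {'prompt': 'a woman reading in a cafe, East Asian, natural lighting',
    #  'negativePrompt': 'traditional costume, stereotypical clothing'}
```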

Adobe Employee, Oct 24, 2023


Thank you for reaching out. We use the location indicated in your profile to show people with similar ethnicities to your region. If you have a specific ethnicity you want to see in your images, please indicate it in your prompt. This will override the region default.
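
For anyone reaching Firefly programmatically rather than through the website, a rough sketch of the same idea follows. The endpoint URL, headers, and the `promptBiasingLocaleCode` field are assumptions about the Firefly Services API and should be verified against Adobe's current documentation; the point is simply that an explicit ethnicity in the prompt text, or an explicit locale hint, may replace the profile-based default.

```python
# Hedged sketch: generating an image while being explicit about who should appear.
# The endpoint, headers, and "promptBiasingLocaleCode" are assumptions about the
# Firefly Services API, not confirmed details from this thread.
import requests

FIREFLY_ENDPOINT = "https://firefly-api.adobe.io/v3/images/generate"  # assumed URL


def generate_image(prompt: str, access_token: str, api_key: str,
                   locale: str = "en-US") -> dict:
    """Send a text-to-image request with an explicit locale hint."""
    payload = {
        "prompt": prompt,                   # explicit ethnicity here overrides the region default
        "promptBiasingLocaleCode": locale,  # assumed parameter for the regional default
    }
    headers = {
        "Authorization": f"Bearer {access_token}",
        "x-api-key": api_key,
        "Content-Type": "application/json",
    }
    response = requests.post(FIREFLY_ENDPOINT, json=payload, headers=headers, timeout=60)
    response.raise_for_status()
    return response.json()


# Example call (credentials are placeholders):
# generate_image("portrait of a Colombian woman, studio lighting",
#                access_token="<token>", api_key="<client id>")
```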

New Here, Jan 06, 2024


Hi,

I would also like to report a racial and gender bias with Firefly. The prompt "A group of humanoid AI sitting in a circle around a lotus flower" returned results that were of Indian and African American women with some technology on their bodies. Further experiments produced mixed results, even with an upload of my own image of male and female AI with silver skin. How is this "regional" if I live in Canada? This tells me that your system is biased toward contemporary social preferences for diversity, skewed toward BIPOC mandates of some kind.

Explorer, Jan 14, 2024


Can someone from Adobe explain this screenshot? I don't think the current answers are sufficient to explain this.
[Screenshot 2024-01-14 at 10.51.22 PM.png]

Community Expert, Jan 15, 2024


quote
Can someone from Adobe explain this screenshot? I don't think the current answers are sufficient to explain this.

The term that describes ethnicity would be "caucasian".

Explorer, Jan 15, 2024


Okay Monika. I will use the word "caucasian" for you and the folks at Adobe. (But to keep a level playing field, you should look up the word "caucasian" for clarification, because, according to the experts, "The Caucasian race is an obsolete racial classification of humans".)

Now for the hard questions...

Does Adobe consider the word "white" to be racist?

Because Adobe accepts the use of the word "black" to describe what we now term "people of color". It also allows the use of the word "brown" to describe what appear to be women of Hispanic or Mediterranean descent.
When I playfully asked Firefly for a "purple woman riding horse", it gave me a black woman riding a purple horse.

I tried other colors for people, too... just for fun (because, who knows, maybe I want to create a blue person who looks like a smurf some day.)

I tried all sorts of different colors for people and it never gave me a "caucasian" woman as an option. Why is that? Aren't "Caucasian" women part of the pool of humans that AI should be pulling samples from?

I understand that making AI understand all the nuance of humanity is an incredibly difficult task, but we're people, right? We use a wide variety of words to describe things. Limiting what those words describe and eliminating the use of some of those words reduces the ability of AI to interpret the subtleties of the human condition. 


Does being so inconsistent with race and gender seem like a slippery slope?

 

Community Expert, Jan 15, 2024


quote
Does Adobe consider the word "white" to be racist?

I do not know that. But I would assume that if they did, the word would most probably be banned.

But since your only problem is that you cannot generate what you want using that word, I would deduce that it is just not precise. I mean, when the word "white" gives you people with black skin, then possibly Firefly misinterpreted your choice of terminology, right? It probably just didn't associate "white" with the color of the skin.

Explorer, Jan 15, 2024


Very possible, Monika. 🙂

But Adobe doesn't actually "ban" words, do they? They just give the "can't load" error instead. 😉 ("vomit", "drunk", "punch", "knife"... though "sword" is acceptable). It's the inconsistencies that are going to be the big hurdle to users getting what they want. The world isn't all perfect and shiny. It can be dirty and mean. And as creators, we have to be able to illustrate whatever ideas we are tasked with illustrating... good or bad. And if Adobe is trying to eliminate the bad (which is at least 50% of real life), they'll get left behind in the AI race.

I appreciate you being open to the discussion. Different perspectives help me wrap my head around it all.

Community Expert, Jan 15, 2024


quote
But Adobe doesn't actually "ban" words, do they? They just give the "can't load" error instead. 😉

There are banned words. And if you use them, you will get a red box with an alert pointing to the guidelines.

Explorer, Jan 15, 2024


Haven't seen those notifications yet.
I've tested out a bunch of "iffy" words (including "gun", which should have crashed the server :))) and only get this... how bad must the word be for me to get the red error?
[Screenshot 2024-01-14 at 11.39.38 PM.png]

Community Expert, Jan 15, 2024


OK, that looks like the word is banned as well. Maybe they are using the red alert only inside the applications.

I also assume that the list of words might be fluid, since people complain and then maybe words get re-assessed.

Community Expert, Jan 15, 2024


Hi @holtwebbart,

We've asked for this feature before, and we are making the request again.

If the engine is unhappy with words or phrases, these could be highlighted with alternate suggestions.

It is AI, after all, right?

Besides, the engine is making a decision based on an algorithm.

Simply sharing the words or phrases it finds inappropriate would help creatives, instead of us trying to guess what it is unhappy with.

Tx
mj
iMSD
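
As a purely hypothetical illustration of what that request could look like on the client side: a local pre-check that flags terms likely to be refused and offers alternates before the prompt is ever submitted. The word list below is assembled from terms mentioned in this thread; Adobe has not published its actual list, and the suggested replacements are invented.

```python
# Hypothetical sketch of the feature request above: flag prompt terms that a
# service might reject and suggest alternates before submitting. The denylist
# and replacements are invented for illustration; Adobe has not published a list.
import re

FLAGGED_TERMS = {
    "gun": "water pistol",
    "knife": "sword",        # the thread reports "sword" is accepted
    "vomit": "spilled drink",
}


def review_prompt(prompt: str) -> list[tuple[str, str]]:
    """Return (flagged term, suggested alternate) pairs found in the prompt."""
    hits = []
    for term, alternate in FLAGGED_TERMS.items():
        if re.search(rf"\b{re.escape(term)}\b", prompt, flags=re.IGNORECASE):
            hits.append((term, alternate))
    return hits


if __name__ == "__main__":
    for term, alternate in review_prompt("pirate with a knife and a gun boarding a ship"):
        print(f"'{term}' may be refused; consider '{alternate}'")
```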

New Here, Mar 20, 2024


Clearly, your perception of ethnicities according to geolocation is very biased. I'm in Colombia and all I get is Indian and African people. Latin American people can also be white. I find this geolocation thing useless.

New Here, Apr 17, 2024



As I'm in Japan, I only get Asians, but just random Asians.
For Japanese people, they look like Chinese people. The clothes are also very stereotypical.
Writing "women in pink dress" brings either Vietnamese or Chinese clothes.
Which makes no sense... It's a very American way of grouping people...
