No matter how hard I try, I can't get Firefly to generate 'vikings' with the correct nationality. I get four variations, and only one is northern European. When I ask it to generate a northern European, it makes no difference: the originally generated versions stay practically identical.
@선미22530658ny81 We use the location indicated in your profile to show people with similar ethnicities to your region. If you have a specific ethnicity you want to see in your images, please indicate it in your prompt. This will override the region default.
That being said, can you share a link to one of the images you created using the prompts with the issue you described?
Use Caucasian in the first part of the prompt and remove all the other references to skin color.
Where is your location when you're writing a prompt? Sometimes Firefly uses your current location to infer the typical skin color of people living near you. Try describing the skin color you want in detail.
I'm not sure where to find my location. I have looked and looked to see if it's set to something odd. I'm in the United States. Thanks much!
It can determine it based on your IP address. That is not 100% accurate, but usually close enough.
droopy
Thanks. I live in eastern Kentucky and I don't use a VPN, so it should say Kentucky. So not sure what's going on lol
I can't seem to get a white football player either. I really wonder if I'm doing something wrong?
Hello @Ollaft37801674397w,
Thank you for your message. I am sorry you are having this problem.
This issue has been discussed in these threads:
Adobe basically had three choices:
The decision Adobe made is to try to make results appear more culturally relevant based on where the user is located. There are advantages and disadvantages to doing that, and in some cases the results can be far from what you are looking for. You can make your prompts more descriptive of the people and objects to override that region/culture behaviour. It would be nice for Adobe to allow this to be disabled, or overridden with another location/region or completely diverse subjects.
Thanks,
droopy
Just to let you know, when asked for Indian images, the AI usually generates Muslim and Arab images. Even the background structures are Islamic rather than Indian. Please consider it. Just for your information, Indian turbans and Muslim turbans are different. Only a small percentage of people in India wear turbans, and only about 12% of people in India are Muslim. Please don't take my observation in a religious way. This is just to correct your AI image generation.
Regarding this ongoing issue with Firefly, I realize it is far from perfect, and while it is quite good at producing AI-generated images, it still has an annoying issue in generating people and their nationalities. As I have suggested before, give Firefly a toggle switch that turns 'nationality bias' on or off. If ON, it considers your geolocation when generating 'culturally accurate' images whenever no nationality is specified in the prompt. If OFF, it generates people of random nationalities, with the cultural accuracy of each, again when the prompt does not specify one. That would be really useful as a feature, or something along those lines. In my example, my prompt is 'a woman surfing on a wave, flat image, simple graphic design clipart'. I feel the image is just way over the top. Of course I know how to specify the description more precisely in my prompt, but this is seemingly overly biased, IMO. My location is Japan.
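The toggle proposed above could be sketched roughly like this. This is purely an illustration of the suggested behaviour; the function name, parameters, and ethnicity list are my own assumptions, not Adobe's actual implementation or API:

```python
# Sketch of the proposed 'nationality bias' toggle. All names here are
# hypothetical -- this is not how Firefly actually works internally.
import random

def resolve_ethnicity(prompt, region_ethnicity, region_bias_on, known_ethnicities):
    """Decide which ethnicity a generation should depict."""
    # An explicit mention in the prompt always wins, toggle on or off.
    lowered = prompt.lower()
    for ethnicity in known_ethnicities:
        if ethnicity.lower() in lowered:
            return ethnicity
    if region_bias_on:
        # Toggle ON: fall back to the user's geolocated region.
        return region_ethnicity
    # Toggle OFF: pick at random so results stay diverse.
    return random.choice(known_ethnicities)
```

With the surfing prompt above and a user located in Japan, the toggle ON would default to a Japanese subject, while toggle OFF would pick any ethnicity at random; an explicit ethnicity in the prompt would override both.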
Expand the skin tones in people images to include melanated, African, African American, and Black people. There is a lack of diversity in searches.
There are some options for this:
Search engines: Many search engines allow filtering by skin tone.
Stock photo sites: Look for sites with diverse photo collections.
Keywords: Use specific terms like "melanated skin" or "Black models."
Advanced search: Utilize advanced search options to refine results.
Support diverse creators: Choose platforms showcasing underrepresented groups.
@AgentMonroe if by "search" you mean prompts, the AI will generate based on specificity. It also takes your geolocation into account to give more localized results. However, adding specific ethnicities will give you better, more expected results.
Hey, I've been testing Firefly for quite some time. However, I've observed that whenever I mention 'woman', 'men', 'male', or 'female', Adobe Firefly consistently returns images of Indian people. I'm not sure of the reason, as I've never specified a location or any other detail in the prompt. I am from India.
Hi,
As you stated yourself, you are from India, and since your browser locates you in that region, the AI tries to stay in that region too. For me, when I do not specify other regions, I get more European-looking people. If you want other regions of the world, you can specify that.
Interesting one this @Ashutosh32207281my2e,
The engine is clearly geo-tagging.
Perhaps add the types of people you require?
You can also add negative prompts now.
Let us know how you go
Best
mj
Thank you for reaching out. We use the location indicated in your profile to show people with similar ethnicities to your region. If you have a specific ethnicity you want to see in your images, please indicate it in your prompt. This will override the region default.
Hi,
I would also like to report a racial and gender bias with Firefly. The prompt "A group of humanoid AI sitting in a circle around a lotus flower" returned results of Indian and African American women with some technology on their bodies. Further experiments produced mixed results, even with an upload of my own image of male and female AI with silver skin. How is this "regional" if I live in Canada? This tells me that your system is biased based on contemporary social preferences for diversity, toward BIPOC mandates of some kind.
Can someone from Adobe explain this screenshot? I don't think the current answers are sufficient to explain this.
Can someone from Adobe explain this screenshot? I don't think the current answers are sufficient to explain this.
The term that describes ethnicity would be "Caucasian".
Okay Monika. I will use the word "caucasian" for you and the folks at Adobe. (But to keep a level playing field, you should look up the word "caucasian" for clarification, because, according to the experts, "The Caucasian race is an obsolete racial classification of humans".)
Now for the hard questions...
Does Adobe consider the word "white" to be racist?
Because Adobe accepts the use of the word "black" to describe what we now term "people of color". It also allows the use of the word "brown" to describe what appear to be women of Hispanic or Mediterranean descent.
When I playfully asked Firefly for a "purple woman riding horse", it gave me a black woman riding a purple horse.
I tried other colors for people, too... just for fun (because, who knows, maybe I want to create a blue person who looks like a smurf some day.)
I tried all sorts of different colors for people and it never gave me a "caucasian" woman as an option. Why is that? Aren't "Caucasian" women part of the pool of humans that AI should be pulling samples from?
I understand that making AI understand all the nuance of humanity is an incredibly difficult task, but we're people, right? We use a wide variety of words to describe things. Limiting what those words describe and eliminating the use of some of those words reduces the ability of AI to interpret the subtleties of the human condition.
Does being so inconsistent with race and gender seem like a slippery slope?
Does Adobe consider the word "white" to be racist?
I do not know that. But I would assume that if they did, then the word would most probably be banned.
But since your only problem is that you cannot generate what you want using that word, I would deduce that it is just not precise. I mean, when the word "white" gives you people with black skin, then possibly Firefly misinterpreted your choice of terminology, right? And probably just didn't associate "white" with the color of the skin.
Very possible, Monika. 🙂
But Adobe doesn't actually "ban" words, do they? They just give the "can't load" error instead. 😉
("vomit", "drunk", "punch", "knife"... though "sword" is acceptable). It's the inconsistencies that are going to be the big hurdle to users getting what they want. The world isn't all perfect and shiny. It can be dirty and mean. And as creators, we have to be able to illustrate whatever ideas we are tasked with illustrating... good or bad. And if Adobe is trying to eliminate the bad (which is at least 50% of real life), they'll get left behind in the AI race.
I appreciate you being open to the discussion. Different perspectives help me wrap my head around it all.
But Adobe doesn't actually "ban" words, do they? They just give the "can't load" error instead. 😉
There are banned words. And if you use them you will get a red box with an alert pointing to the guidelines.