Perhaps it's the metadata describing the image or video, or something else, but I've noticed some strange results. I wanted to find images contrasting people's statures by entering the words 'tall man, short woman', or vice versa. A significant portion of the results has nothing to do with those words. What I have seen are mostly images of women showing their middle finger ('flipping the bird'), some in strange positions. Does the search algorithm treat those keywords as 'offensive', as if the woman being short compared to the man means the output should be an '-f- you'?
No, but if you look at the keywords of those women with the middle finger, they are missing "tall man" but have "short hair" or are clothed in shorts and things like that. I doubt that the search algorithm is so "intelligent" that it shows exactly what you want. But if you search for { "tall man" and "short woman" } you get a very restricted collection.
The search term { tall man and short woman -finger } shows better results. I suppose you also need to add { -"middle finger" }.
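Just to illustrate why "short woman" can pull in women in shorts or with short hair, here is a rough sketch of how a keyword matcher might behave. This is only a guess at the general idea, not Adobe Stock's actual algorithm, and the asset titles and keywords below are invented purely for illustration.

# Rough sketch of keyword matching with negative ("-") terms.
# NOT the real search algorithm; asset data is made up.

def stem(word: str) -> str:
    # Very crude stemmer: "shorts" -> "short"
    return word.lower().rstrip("s")

def matches(query: str, keywords: list[str]) -> bool:
    tokens = query.lower().split()
    include = {stem(t) for t in tokens if not t.startswith("-")}
    exclude = {stem(t.lstrip("-")) for t in tokens if t.startswith("-")}
    asset_terms = {stem(w) for kw in keywords for w in kw.lower().split()}
    # Any single matching word can pull the asset in; excluded words drop it.
    return bool(include & asset_terms) and not (exclude & asset_terms)

assets = {
    "woman in shorts flipping the bird": ["woman", "shorts", "middle finger"],
    "tall man next to short woman":      ["tall man", "short woman", "height"],
}

for title, kws in assets.items():
    print(title, matches("tall man short woman -finger", kws))
# Only the second asset survives once "-finger" is added.

Under that kind of loose, word-by-word matching, the first asset is pulled in by "woman" and "shorts" alone, which is why adding exclusion terms narrows the results so much.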
Searching is a complex affair. The contributor needs to think like a customer to enter the right keywords, and the customer needs to start broad and then narrow the search down to get more relevant assets.