Every “similar video” under stock photos is NSFW, and now some are inappropriate videos of children.
Hi, all. I see others have posted similar threads in the past, and I'm confused (and frankly, disturbed) as to why this is still occurring. Sidenote: I would share real screenshots, but I don't want to capture these on my company computer, so I've attached a photo taken with my phone.
No matter what image I'm looking at, I get NSFW videos under "similar in videos". For example, if I'm looking at a stock image of a tree, the suggested videos underneath will be people in bed, women in lingerie, or straight-up softcore p***. This happens with nearly every single image I view. Again, I'm on a work computer, so this is not ideal. SafeSearch is turned on.
What prompted this post, however, is the most disturbing part. I was looking at a simple photo of a beach, and the suggested videos below were children in bathing suits, children at the doctor lifting up their shirts, and a young girl (of age, I hope?) in lingerie.
This is very bizarre and needs to be fixed.
