Participant
May 23, 2023

Generated images violate user guidelines

 

So as you can see, it's a PG-13, relatively inoffensive image of a woman in a bunny outfit. The top worked fine, and I was able to complete the top ear, which is cool. When I tried to extend the bottom with generative fill, though, I got this warning. It's just a pair of legs in stockings that I wanted to extend.

It feels like a false positive, though I could be wrong. I suspect it would do the same for women in swimsuits.

Figured I'd share here.

1084 replies

rayek.elfin
Legend
August 24, 2023

@zvi_t Ah, good to know. It does validate my basic argument: Adobe does not want offending imagery in their image stock. With many contributors it is difficult to keep track of all the content, which is a classic challenge for any stock library.

Known Participant
August 24, 2023
Quote from @rayek.elfin:
"Ask yourself: does Adobe's stock image library include offending imagery? No, it does not."

 

Yes it does. Occasionally, I come across images and videos on Adobe stock that are illegal. Because I got banned from the forums last time for being too specific (instead of thanking me for helping keep Adobe stock safe), I cannot elaborate. All I can say is that I now send my findings to an Adobe employee, and they are removed within a few days.

rayek.elfin
Legend
August 24, 2023

@Encartauk If all of that is indeed the case as you say... Ask yourself why Adobe added code to Photoshop to deter counterfeiting of banknotes.

https://helpx.adobe.com/photoshop/cds.html

 

The difference is that full-blown, limitless generative AI is akin to handing your users an endless library of images that includes a wide range of potentially offending ones. In classic image editing and compositing, users import existing images or paint/draw their own. Adobe is not responsible for what its users import or create with their own hands. With generative AI, a model built by Adobe becomes a built-in facility for generating a virtually limitless library of images within Photoshop itself, via Adobe's servers. The context is completely different from users importing external images or drawing their own pictures. Users do not generate these AI images out of the void: the AI models are based on the library of images Adobe fed them.

 

Ask yourself: does Adobe's stock image library include offending imagery? No, it does not. So why would they allow their generative AI to act differently?

 

Also, with generative AI in the news as a "hot item", I doubt Adobe wants its software mentioned on the evening news in the same sentence as celebrity nude fakes, fraud, crime, and abuse. Whether we like it or not, generative AI lowers the threshold for creating believable fake imagery to the point where even novices can pull it off relatively easily. It is a potential marketing nightmare waiting to happen.

 

Btw, I am all for freedom of information accessible to all. But it has been proven again and again that a majority of humans living on Earth cannot (yet) be trusted with total freedom. That is unfortunate, and until humanity has proven itself to be a trustworthy lot and kind to each other, I'd say it is probably wise that Adobe limits its generative AI. So I understand and empathize with Adobe's point of view, even though on the principle of full freedom of information I agree with you, @Encartauk. With the current state of humanity, Adobe is more or less forced to play "nanny" in this particular case. Nothing rated above PG 😉

 

Besides (and this is an important point), allowing Photoshop users to generate whichever images they want would also be extremely problematic in regard to the young user base: Adobe cannot make itself responsible for letting kids use Photoshop to generate lewd imagery. Parents, schools, and various religious institutions would not be amused!

 

Should Photoshop then become adult-only software? That's just not viable. Should Adobe release different versions for different age groups? How would that work? It would be a total nightmare on so many levels. Have you thought about young users and schools at all, and how they factor into this equation? I am sure you will agree with me that generative AI should be limited when kids use it to generate images, or within a school environment.

 

Nope: yet another reason to integrate a kid-friendly generative AI and avoid all that potential trouble for Adobe. That is definitely not a can of worms you want to open.

 

Just playing the devil's advocate here: Adobe would be stark raving mad to allow full artistic freedom in its generative AI at this point in time (or ever!).

 

Besides --I repeat once more-- the tools for ultimate freedom in generative AI are out there if you need them.

Participating Frequently
August 24, 2023

@Encartauk

Excellent points! Fully agree with your rebuttal.

Known Participant
August 23, 2023

If someone uses a brand of car to mow people down, does the car company get called into court for enabling and assisting? If someone paints an inappropriate picture of a public figure on a canvas with oil paints, do those companies that make those tools get called into court for enabling and assisting? If someone uses a brand of laptop running a certain OS and hacks into a bank stealing billions, do those software companies get the same? 

People have had the means (including using Photoshop) to create fake images that could've been used in fraud/defamation for decades - has Adobe had to go to court previously for any of those reasons?

As I said, once the image is created, it's then down to whatever happens after that and which laws it breaks in the environment it will exist within. If it's for someone's personal use, that's not a problem unless it breaks laws that would already apply, e.g. ch!ld pron. If it was violent and on public display, then it could fall under public decency violations or whatever. If it was used to defame someone (and I'm not saying it's OK to do so, but how would an AI filter this out correctly?), then the person who created it would fall foul of the law - why would it be anything to do with Adobe? Again, Adobe has already been providing the means to manipulate images for decades; why is this any different?

rayek.elfin
Legend
August 23, 2023

@Encartauk 

Quote:
"Better solution: don't block anything and let people create what they want and then they can later suffer the potential consequences with wherever they use the image next - just as they would with painting, drawing, sculpting, photography or any other form of art that springs to mind."


Adobe cannot allow its users to generate whichever images they want, because (as far as I can see) that would open the floodgates to potential legal consequences. Just imagine it being proven in court that Adobe's generative AI tools were used in a fraud case. Potentially (especially so in the US) this could lead to Adobe being hauled into court for enabling and assisting. There is a reason why Photoshop refuses to open and save images of banknotes. It's merely a giant company protecting itself.

 

I sincerely doubt this will ever change. If a user wants to generate anything at all with generative AI, the only option is to look at alternatives outside the Adobe ecosystem, which come with the added benefits of qualitatively better results and many more options. Running generative AI tools on your own local machine opens the path to full freedom over whatever images you'd like to generate, although this does require more expensive hardware and a bit more knowledge to install and set up.

 

It is what it is. Other options exist if Adobe's generative AI turns out to be too restrictive for your work.

Known Participant
August 23, 2023

I'm not being naive, guys. As a software engineer myself, I don't expect GF to go from beta to stable in three months. GF, which uses Firefly, was not developed within three months; Firefly existed before Photoshop incorporated it as the GF beta.

I won't debate this; I'm entitled to my opinion. If GF started generating genuinely inappropriate images, Adobe would have the means to fix that sooner than you can say "generative fill." It's not that they lack the means to fix the daily reported false violation alerts; it's just not their priority. Beta neural filters stay in beta because Adobe is too busy with other things to have time to fix them.

Also, if they were smart, they'd give users the option to generate a single variation instead of three. This would cut their (very expensive) server usage for such requests to roughly a third, and shorten the user's generation time to a third where variations are produced sequentially.
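The arithmetic behind that suggestion can be sketched in a few lines. This is a back-of-the-envelope illustration only: the figures (seconds per image, cost per GPU-second) are hypothetical placeholders, and `fill_request_cost` is an illustrative helper, not any Adobe API. It assumes variations are generated sequentially on the server.

```python
# Hypothetical cost model for a generative-fill request that returns
# N variations, generated one after another on a GPU.

def fill_request_cost(variations, seconds_per_image, cost_per_gpu_second):
    """Return (gpu_seconds, cost) for one fill request."""
    gpu_seconds = variations * seconds_per_image
    return gpu_seconds, gpu_seconds * cost_per_gpu_second

three_secs, three_cost = fill_request_cost(3, 5.0, 0.001)
one_secs, one_cost = fill_request_cost(1, 5.0, 0.001)

# A single variation uses a third of the GPU time (and cost) of three.
print(three_secs, one_secs)  # 15.0 5.0
```

Under this toy model, serving one variation instead of three divides both compute cost and the user's wait by three; whether the real savings are that large depends on how Adobe batches the generations, which isn't public.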

daniellei4510
Community Expert
August 23, 2023

The neural filter "Depth Blur" has been in beta since, if I recall correctly, 2021. Live Gaussian fill was once in beta, then introduced into Photoshop proper, and has since been removed even from the beta version of Photoshop. Apple's operating systems, beginning with developer betas and progressing to public betas, spend nearly a year in testing before being released to the general public. It's a long and painstaking process, and three or four months is a very short time in comparison.

Adobe Community Expert | If you can't fix it, hide it; if you can't hide it, delete it.
Kevin Stohlmeyer
Community Expert
August 23, 2023

@zvi_t But it is a beta, meaning this is all on a voluntary basis and by definition a work in progress. No one is forcing anyone to participate. Honestly, three months to develop something as complex and nuanced as an AI interface in such a robust imaging software package is short-term, IMO. Expecting Adobe to roll out something this complicated in such a short time is ridiculous.

Known Participant
August 23, 2023

This post titled "Generated Images Violate User Guidelines" was published on May 23, 2023. It's been three months and 43 pages of discussion, yet there have been no changes regarding these incorrect warnings. People can only be patient with the "beta" excuse for so long.