
Fake image detection aid

Explorer, Jul 31, 2024

Version:  All versions of Photoshop

 

Idea #1

When Photoshop modifies a file, it saves the Photoshop version to the file's properties in the "EXIF - Software" field.  This shows that the file has been modified.  It gives no indication, however, of what was done to the file or how much, so it isn't very useful.

 

I suggest that when the EXIF data is written, the current number of History states be included.  While this won't tell what was changed, it will show how much has changed.  A high tool-usage count would call the veracity of the image into question.

Idea | No status
TOPICS: iPadOS, macOS, Windows
901 Views

23 Comments
Community Expert, Jul 31, 2024

@B.Smurf This is already possible using Photoshop's History Log (Settings > History Log) and Content Credentials; however, this leads to SERIOUS metadata bloat and can cause a ton of issues with files further downstream.

Community Expert, Jul 31, 2024

A fake image today is created by AI. That's potentially just one history state.

 

The days are gone when you had to work at it.

Community Expert, Jul 31, 2024

@D Fosse Content Credentials can track AI usage in images.

Explorer, Jul 31, 2024

That is true, but the suggestion I was planning to enter next is that any file created by AI or changed by AI should be flagged as such. Any file so flagged would be considered fake.

Explorer, Jul 31, 2024

I'm not recommending saving the history states themselves, just how many of them there were over the life of the file. The only thing added to the metadata would be one number.

Community Expert, Jul 31, 2024

quote

That is true, but the suggestion I was planning to enter next is that any file created by AI or changed by AI should be flagged as such. Any file so flagged would be considered fake.


By @B.Smurf

 

This is already covered by Content Credentials, which is a beta feature at this time:

 

https://helpx.adobe.com/au/creative-cloud/help/content-credentials.html

https://contentcredentials.org

 

https://contentauthenticity.org

 

https://c2pa.org

Community Expert, Jul 31, 2024

quote

I'm not recommending saving the history states themselves, just how many of them there were over the life of the file. The only thing added to the metadata would be one number.


By @B.Smurf

 

Keep in mind that Photoshop can remove metadata on export, which is by design. Then there are third-party tools, and even some website upload software, that remove metadata.

Community Expert, Jul 31, 2024

Photoshop already keeps track of this via XMP History Action metadata entries.

 

A brand new document, saved for the first time:

 

[XMP-xmpMM]     HistoryAction                   : created

 

The file was then opened, edited, and resaved once: 

 

[XMP-xmpMM]     HistoryAction                   : created, saved

 

Then opened and resaved a second time:

 

[XMP-xmpMM]     HistoryAction                   : created, saved, saved

 

Dates/times are covered under XMP History When metadata entries.

 

Photoshop is writing a whole lot of metadata even when history logging is inactive.
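Those HistoryAction values live in the file's embedded XMP packet, which is plain RDF/XML, so reading them doesn't require special tooling. A rough sketch using only Python's standard library, with a hand-written sample packet in the shape Photoshop uses for xmpMM:History (real output carries many more entries and attributes):

```python
import xml.etree.ElementTree as ET

RDF = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
ST_EVT = "http://ns.adobe.com/xap/1.0/sType/ResourceEvent#"

# Hand-written sample, not real Photoshop output.
SAMPLE_XMP = f"""\
<x:xmpmeta xmlns:x="adobe:ns:meta/">
 <rdf:RDF xmlns:rdf="{RDF}">
  <rdf:Description xmlns:xmpMM="http://ns.adobe.com/xap/1.0/mm/"
                   xmlns:stEvt="{ST_EVT}">
   <xmpMM:History>
    <rdf:Seq>
     <rdf:li stEvt:action="created" stEvt:when="2024-07-31T09:00:00"/>
     <rdf:li stEvt:action="saved"   stEvt:when="2024-07-31T09:05:00"/>
     <rdf:li stEvt:action="saved"   stEvt:when="2024-07-31T10:12:00"/>
    </rdf:Seq>
   </xmpMM:History>
  </rdf:Description>
 </rdf:RDF>
</x:xmpmeta>"""

def history_actions(xmp_packet: str) -> list[str]:
    """Return the stEvt:action of every xmpMM:History entry, in order."""
    root = ET.fromstring(xmp_packet)
    return [li.attrib[f"{{{ST_EVT}}}action"]
            for li in root.iter(f"{{{RDF}}}li")]
```

For this sample, `history_actions(SAMPLE_XMP)` yields `created, saved, saved`, matching the ExifTool output shown above.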

Community Expert, Aug 01, 2024

quote

That is true, but the suggestion I was planning to enter next is that any file created by AI or changed by AI should be flagged as such. Any file so flagged would be considered fake.

By @B.Smurf

 

The picture below is an example of what is already being beta-tested in the public version of Photoshop right now, in support of the industry-wide Content Authenticity Initiative that Stephen_A_Marsh already linked to earlier. And this is just the Photoshop side of it; if the exported image is uploaded to a web page, anyone can submit it to the non-Adobe verification website for Content Credentials. The metadata is “tamper-evident” and traceable in a way similar to how blockchain is handled. That is important because it isn’t enough just to have a “flag,” but you have to have a way of securing it so that if anyone tries to defeat the flag, that tampering action will itself be flagged so that you can be suspicious of the image.

 

[Attached image: Photoshop-Export-As-Content-Credentials-Preview.jpg]

Explorer, Aug 02, 2024

You guys are obviously way beyond my programming skills. Love the idea of dragging the questionable file in for verification. Slick!

Alas, there is still a problem. If you print the questionable file and then scan the print, you get a clean file. We need the image itself to say, "I've been changed by AI".

Two questions:
1. Can AI scan the file and detect the changes? I'm certain it could scan the web for component picture parts.
2. Can something be embedded in the image that would only be detected by the verification process? Sort of an invisible watermark.
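On question 2: invisible watermarking is a real technique (Photoshop's old Digimarc plug-in offered a robust commercial version). As a toy sketch only, the simplest form hides a marker in the least significant bits of raw pixel bytes; note that a naive LSB mark like this would not survive printing, rescanning, resizing, or even JPEG compression:

```python
# Toy LSB (least-significant-bit) watermark: hide a short byte marker in
# the LSBs of the first len(mark)*8 pixel bytes, then read it back.
# Illustration only; real watermarking systems are far more robust.

MARK = b"-AI-"  # 4-byte marker to hide

def embed(pixels: bytes, mark: bytes = MARK) -> bytes:
    """Write each bit of `mark` (MSB first) into successive pixel LSBs."""
    bits = [(byte >> i) & 1 for byte in mark for i in range(7, -1, -1)]
    out = bytearray(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit
    return bytes(out)

def extract(pixels: bytes, length: int = len(MARK)) -> bytes:
    """Read `length` bytes back out of the pixel LSBs."""
    bits = [pixels[i] & 1 for i in range(length * 8)]
    return bytes(
        sum(bit << (7 - j) for j, bit in enumerate(bits[k:k + 8]))
        for k in range(0, len(bits), 8)
    )
```

Each pixel byte changes by at most 1, so the mark is invisible to the eye, but any lossy re-encoding scrambles those low bits, which is exactly why the thread's print-and-scan scenario defeats it.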

Community Expert, Aug 02, 2024

@B.Smurf there is no way to verify whether a scanned image of a print was created using AI vs. manual editing. Your proposal is science fiction. What would be the purpose of verifying such an elaborate scenario? Is this a real-world need/concern: users printing and then scanning images to bypass AI detection?

Explorer, Aug 02, 2024

What is so sci-fi-ish about the first and last four bytes of an image reading "-AI-" (AF 41 49 AF)?

When the file is saved, PS would check the AI flag and write these bytes into the image. When the image is scanned, these 4-byte combinations will be the same. If only one corner was "-AI-" it would still be a hit. If they were the same but didn't read "-AI-" it would still be a hit.

What are the odds of the first and last 4 bytes of an image being the same? This combination is so small nobody would ever notice it when looking at the file.

Is this guaranteed to catch all scanned AI files? Probably not, but it would catch some, which is better than what we have now.

Community Expert, Aug 02, 2024

quote

2. Can something be embedded in the image that would only be detected by the verification process? Sort of an invisible watermark. … Is this guaranteed to catch all scanned AI files? Probably not, but it would catch some, which is better than what we have now.

By @B.Smurf

 

In practice, scanning is not how your scenario is likely to be attempted. As is already being discussed elsewhere on the Internet, it's a lot easier and faster to flatten/copy and paste to another document, or take a screen shot.

 

But any method that leaves the verification metadata behind automatically makes the resulting image untrustworthy, because the info isn’t there. Just like leaving your driver’s license behind doesn’t help you if you get pulled over. Trust comes from the data being present, so its absence is automatically suspicious.

 

Also, the image watermark idea is not new. For many years, Photoshop included the Digimarc copyright watermark plug-in. It no longer does (I guess people decided it was not as great a solution as it sounds), but if you want to, you can still download the Digimarc plug-in and there are still directions on the Adobe website for using it in Photoshop.

 

You might want to bring your concerns to the Content Authenticity Initiative group, because it is notable that the watermark idea is something the industry chose not to include in the Initiative. Those companies and organizations, some of whom are interested in protecting much larger collections of creative work of very high value, decided to go in the direction of tamper-evident metadata instead.

Community Expert, Aug 02, 2024

@B.Smurf you need to understand pixels vs. dot patterns when printing. Printing an image completely changes its pixel/dot structure and also converts it from RGB to CMYK.

Explorer, Aug 02, 2024

Maybe you don't understand what the problem is. The world is absolutely awash with intentionally fake photographs. Nothing you see is real. Nothing can be trusted.

What I want is a relatively easy way to say this photo has been doctored. Exactly what steps were executed in the doctoring process is irrelevant. Since Photoshop does most of the doctoring, I thought they might be interested in trying to solve this problem. Maybe not.

Just keeping track of how many tools were used on the image would be a great help. Right now the only thing it shows is that Photoshop accessed the file at some time.

Community Expert, Aug 02, 2024

@B.Smurf the answer you seek has been presented to you: the Content Authenticity Initiative directly deals with this exact concern.

Community Expert, Aug 02, 2024

quote

Since Photoshop does most of the doctoring, I thought they might be interested in trying to solve this problem. Maybe not.

By @B.Smurf

 

The evidence shows that they have been interested. Photoshop has tried to address this in at least two ways. The first was the Digimarc copyright watermark, which as I said was available for many years. And now, Photoshop has joined the Content Authenticity Initiative, an industry-wide standard (not something proprietary).

 

In both cases, Photoshop has been making the effort to put in some kind of verification when other image apps I have on my computer don’t seem to have made any effort at all in that direction. And maybe that’s because the problem is much more technically challenging to solve, because if it was that easy, wouldn’t everyone be doing it?

 

That last thought is something to focus on here. Let’s suppose for a second that Photoshop is making the least effort out of everybody, as you imply. So then, what is the list of other image editing apps that have rapidly implemented the obvious and superior solution? In other words, if Photoshop is totally not getting it done so that you should be using something else, which image editing app do you suggest switching to that has a working solution right now that you like, proving that the solution isn’t that difficult?

Explorer, Aug 02, 2024

Sorry for casting a slur on Photoshop. I love this program and have been using it since PS 5. And I will keep on using it until I go to the great editing house in the sky.

Back in PS 5 days, newspaper editors were having a problem with edited newspaper photos. Some edits are meaningless: focus, exposure, scratches. Some are more meaningful: cropping and removal of objects. And some completely change the meaning of the picture. The problem was how to distinguish the meaningless edits from the meaningful. This is much more difficult when all you have is the edited file to work with. Photoshop said they were looking into this.

I have thought about this over the years, and as far as I can tell PS never came up with a solution. In the meantime, putting starlet heads on naked bodies became a thing. Annoying, obviously fake, but not of any national concern. Then in 2004 a photo of Jane Fonda and John Kerry appeared, a composite of two separate photos. That was a lot more serious. It was now affecting our national perception of the truth.

Fast forward to the present day, when fake photos are so prevalent that you can't believe anything you see.

So here I am at 3 am when I have an "Ah hah" moment. I know how, with one little change, to distinguish a slightly changed photo from a massively changed photo. Every time PS changes the image you get another History state. Simply count the history states and save the total in the EXIF properties. We don't need to know what the changes were, just that there were a lot of them. The photo of Jane would have taken at least fifty commands to paste in the image and blur the edges to look like one image.

Implementing this suggestion would take maybe 5 lines of code; it could be implemented in the next update and would not cost millions of dollars. It will not require rewriting the documentation or training users, and will have absolutely no effect on program execution or size.

Will this solve all the problems of identifying fake photos? Obviously not. But it will help.

Rick

Explorer, Aug 02, 2024

IT DOES NOT

Community Expert, Aug 02, 2024

@B.Smurf 

 

As I previously wrote, it's already in the software by default, found in XMP metadata, specifically:

 

XMP History Action

XMP History When

 

There are other options, such as embedding concise or verbose Photoshop history metadata among other optional or required metadata entries that are written to a file. There are numerous issues though. Not all file formats support all metadata. Metadata can be removed from within Photoshop or when exporting the image from Photoshop or afterwards by other software. 

 

Content Credentials is also addressing the AI concerns. It's early days.

Explorer, Aug 03, 2024

One last question and I'll get out of your hair.

Where does one find XMP metadata, and what software reads it? None of the software I have has ever heard of it.

Community Expert, Aug 03, 2024

quote

Where does one find XMP metadata, and what software reads it? None of the software I have has ever heard of it.


By @B.Smurf

 

In Photoshop, Bridge etc: File > File Info

 

For a complete listing of metadata, something like the CLI utility ExifTool is the de facto standard for working with metadata.

Community Expert, Aug 03, 2024

quote

Where does one find XMP metadata, and what software reads it? None of the software I have has ever heard of it.

By @B.Smurf

 

There’s a pretty good article about it on Wikipedia:

https://en.wikipedia.org/wiki/Extensible_Metadata_Platform

 

If you scroll down to the “Support and acceptance” heading in that article, there’s a list of software that can read/write XMP metadata.

 

I’ve got one app on my computer that supports XMP but is not on that list (LRTimelapse); it uses XMP to exchange raw edit data with Adobe raw processing apps like Lightroom Classic and Camera Raw.

 

And if you’re looking in the File > File Info dialog box, you can see the actual metadata code by clicking the Raw Data tab.
