fairhopecr3
Participant
June 26, 2017
Question

How can I remove junk RAW DATA from file?

  • 3 replies
  • 9130 views

Hello everyone,

I am hoping one of you can help me.

I have imagery that has excessive amounts of junk information in each file's raw XMP data.

I am talking hundreds of lines such as:

               <rdf:li>006866056250D71BF7680982B0D05DF3</rdf:li>
               <rdf:li>006DD14C462C310A9EF1CBDBCA7B7495</rdf:li>
               <rdf:li>00705EFBC9A5EE18A2EED4A717FB5F3B</rdf:li>
               <rdf:li>007658278627CC4858C4936D67F0DF67</rdf:li>
               <rdf:li>0078FB3D86D842E685C914C2D00A2E40</rdf:li>
               <rdf:li>0079BF078397D573B94DED2786347029</rdf:li>
               <rdf:li>0081E97051E1C1286376E67546DC1F9C</rdf:li>
               <rdf:li>0084E230C00349DC642AC80C2170155A</rdf:li>
               <rdf:li>00851DEE279751E47814F624C7D3F464</rdf:li>
               <rdf:li>00852F3629FD42D645AD7D1CD73D0C4E</rdf:li>
               <rdf:li>008704B02FF85F234D4D0F89072C6796</rdf:li>
               <rdf:li>00878D4A885675232111A258BC24458F</rdf:li>
               <rdf:li>008833914D82F173A079227F0DB988A7</rdf:li>
               <rdf:li>008AF0CF7AC7E67525301FB6AE481ADB</rdf:li>
               <rdf:li>008BC64A056514C18902F6F2CE088528</rdf:li>
               <rdf:li>008F67961E96136A1D348885E296E63F</rdf:li>
               <rdf:li>00959AD22DF8E112CC4D5D4EE7574935</rdf:li>
               <rdf:li>00988D8BF2DA968D5672B08031534596</rdf:li>
               <rdf:li>0099345CD06DDE9EE8E15AC91B4FE71C</rdf:li>
               <rdf:li>009B4FDA6C18CB8E8138D3402E4952A5</rdf:li>
               <rdf:li>009BDAEEBC2332EA4708078E3F9CF771</rdf:li>
               <rdf:li>009C343AE445CAA695C298107FFCFE65</rdf:li>
               <rdf:li>009DFBD4C9F04397A664A71A4465BE52</rdf:li>
               <rdf:li>00A946DD7AE61C92FAD4FA692989CE84</rdf:li>
               <rdf:li>00ABF1CB64AB8E17AD44227EFDE04411</rdf:li>
               <rdf:li>00AFD834575791274DF00ABA026367FD</rdf:li>
               <rdf:li>00B429296471BFB41FD5D7ADEF1E41C4</rdf:li>

etc....

These cause the file sizes to become excessively large. Is there a way to remove this?

I used to have a script that seemed to work, but it no longer removes the junk, and I cannot understand why.

It is:

function deleteDocumentAncestorsMetadata() {

    whatApp = String(app.name); // String version of the app name

    if (whatApp.search("Photoshop") > 0) { // Check for Photoshop specifically, or this will cause errors

        // Function scrubs Document Ancestors from files

        if (!documents.length) {
            alert("There are no open documents. Please open a file to run this script.");
            return;
        }

        // Load the XMP script library if it is not already loaded
        if (ExternalObject.AdobeXMPScript == undefined) ExternalObject.AdobeXMPScript = new ExternalObject("lib:AdobeXMPScript");

        var xmp = new XMPMeta(activeDocument.xmpMetadata.rawData);

        // Begone foul Document Ancestors!
        xmp.deleteProperty(XMPConst.NS_PHOTOSHOP, "DocumentAncestors");

        app.activeDocument.xmpMetadata.rawData = xmp.serialize();
    }
}

// Now run the function to remove the Document Ancestors
deleteDocumentAncestorsMetadata();
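For files where Photoshop is not available, the cleanup the script performs (deleting the photoshop:DocumentAncestors property) can be sketched in Python against raw XMP text, such as an .xmp sidecar file. This is an illustrative assumption, not part of the original script; the function name and regex approach are my own, and it only handles the common element form of the property:

```python
import re

def strip_document_ancestors(xmp_text: str) -> str:
    """Remove the photoshop:DocumentAncestors element (and its rdf:Bag
    of hex IDs) from a raw XMP packet, leaving other metadata intact.
    Only the common element form is matched; the attribute form is
    not handled."""
    pattern = re.compile(
        r"<photoshop:DocumentAncestors>.*?</photoshop:DocumentAncestors>\s*",
        re.DOTALL,
    )
    return pattern.sub("", xmp_text)

# Example: a cut-down XMP fragment with one ancestor ID
sample = (
    "<rdf:Description>"
    "<photoshop:DocumentAncestors><rdf:Bag>"
    "<rdf:li>006866056250D71BF7680982B0D05DF3</rdf:li>"
    "</rdf:Bag></photoshop:DocumentAncestors>"
    "<dc:title>Keep me</dc:title>"
    "</rdf:Description>"
)
cleaned = strip_document_ancestors(sample)
```

Note this text-level approach is only safe on sidecar files or an XMP string you already hold in memory; rewriting metadata embedded inside image binaries should be left to Photoshop or ExifTool.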

If anyone knows a way to clear out this unnecessary metadata, please share!

Thank you in advance!

This topic has been closed for replies.


Q2rg
Participant
August 18, 2022

I sometimes stumble over files in our production that have millions of lines of ancestor IDs. Sometimes these files cause trouble with the software used to remove them. The only thing that worked for me today in one of these cases was to remove all the data with ExifTool. I'll leave the command here on this older thread because it was my first find on Google for this issue. Maybe it serves others.

 

exiftool -all= -tagsfromfile @ -all:all -unsafe [imagename]

 

This strips all metadata (-all=) and then copies the original tags back from the same file, including tags ExifTool normally refuses to copy (hence -unsafe), which rebuilds the metadata without the bloat. Use with caution.

Stephen Marsh
Community Expert
August 18, 2022


@Q2rg – a more targeted/safer command:

 

exiftool -overwrite_original -XMP-photoshop:DocumentAncestors= -r 'path to top-level directory or file'

 

Mac uses single straight quotes ' around file paths with spaces, while Windows uses double straight quotes ".
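When ExifTool is not available, the same targeted, recursive cleanup can be sketched in Python for .xmp sidecar files. This is a hypothetical helper (the name, regex, and sidecar-only scope are my assumptions); it deliberately does not touch metadata embedded inside image binaries:

```python
import re
from pathlib import Path

ANCESTORS_RE = re.compile(
    r"<photoshop:DocumentAncestors>.*?</photoshop:DocumentAncestors>\s*",
    re.DOTALL,
)

def clean_sidecars(folder: str) -> int:
    """Strip the photoshop:DocumentAncestors element from every .xmp
    sidecar under `folder` (recursively). Returns the number of files
    that were actually modified."""
    changed = 0
    for path in Path(folder).rglob("*.xmp"):
        text = path.read_text(encoding="utf-8")
        cleaned = ANCESTORS_RE.sub("", text)
        if cleaned != text:
            path.write_text(cleaned, encoding="utf-8")
            changed += 1
    return changed
```

Because paths are handled by pathlib rather than a shell, no quoting of paths with spaces is needed on either Mac or Windows.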

Akash Sharma
Legend
June 26, 2017

Moving to Photoshop Scripting

SuperMerlin
Inspiring
June 26, 2017

It might be due to the way you are saving the document: there was a problem with "Save For Web" retaining metadata, in that it did not save any metadata changes made in Photoshop. There are similar scripts for Photoshop and Bridge that remove this data without opening the documents.

The Bridge one is here.. Bridge Script to Remove Photoshop DocumentAncestors Metadata

fairhopecr3
Participant
June 26, 2017

Well, the images come to me, initially, with all this junk already in them. As I receive hundreds of these, and many have this junk metadata, my files (including the images) get pretty big very quickly.

I hope there is a way to purge the excess, unnecessary data easily, like an action, so I do not have to do it manually hundreds of times.

c.pfaffenbichler
Community Expert
June 26, 2017

Please post an affected image.