I created a script that processes a single document many times over to output variations of it, using external files to replace content inside the document.
This works perfectly as long as scratch disk space is available. But after hours of processing, the scratch disk eventually fills up, reaching a size of over 150 GB.
I changed the history state settings, but it doesn't seem to affect the scratch disk size.
The document itself isn't getting bigger, and the number of layers stays the same.
But the scratch disk usage still keeps increasing, and only a Photoshop restart will clear up the space again.
At some point I may even want a dedicated system, processing documents for weeks without a restart.
Does anyone know a way to keep Photoshop running without the need for any restarts, perhaps by clearing the scratch disk from a script?
I only know that caches can be purged:
https://theiviaxx.github.io/photoshop-docs/Photoshop/Application/purge.html
https://theiviaxx.github.io/photoshop-docs/Photoshop/PurgeTarget.html
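For what it's worth, the DOM equivalent of those calls would be something along these lines (a minimal sketch; app.purge() and the PurgeTarget values are documented, but I haven't verified how much scratch space they actually free):
// Purge Photoshop's caches through the scripting DOM.
app.purge(PurgeTarget.ALLCACHES);      // everything at once
// Or target individual caches:
app.purge(PurgeTarget.CLIPBOARDCACHE); // clipboard contents
app.purge(PurgeTarget.HISTORYCACHES);  // history states
app.purge(PurgeTarget.UNDOCACHES);     // undo buffer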
I would imagine that one would need to quit Photoshop for the scratch disk space to be safely released, but that is just a guess. You didn't mention your OS (calling an external system script might be required).
Good luck!
Thanks for your advice.
I'm currently testing whether this "purge all" function makes a significant improvement:
// Action Manager equivalent of Edit > Purge > All.
// Note: char IDs must be exactly four characters, so "Al  " needs two trailing spaces.
var d1 = new ActionDescriptor();
d1.putEnumerated(charIDToTypeID("null"), charIDToTypeID("PrgI"), charIDToTypeID("Al  "));
executeAction(charIDToTypeID("Prge"), d1, DialogModes.NO);
Just a little background first:
Normally, a batch that closes one file and then opens the next should not cause the scratch file to grow. This comes up often enough that I've tested it on my own machine. The only time the scratch file grows is when a bigger file is brought in, so that the scratch file needs to be expanded.
However, once memory is reserved and a scratch file created, it will not be released by closing the file. It is recycled and reused for the next file. This is precisely the mechanism that allows huge batches to run smoothly without slowing or bogging down the system.
But then I notice you say "using external files to replace content inside the document". That's probably the key here. A new document needs its own addition to the scratch file, and this extra scratch space will not be released again. So that's one way it could keep growing: you keep feeding it from the outside.
There is a "purge memory" command, but I don't think there is an equivalent "purge scratch disk" function. I assume you have to quit Photoshop for the scratch files to be deleted.
Your script basically does this (as I understand it):
Loop on this indefinitely: open a file, insert an external file, do some processing, save the results, close the file.
(You could even omit the open and close steps; since your file does not grow significantly in size, the scratch disk usage should then stay roughly the same.)
There are two possibilities: either there is a bug in your script, or there is a bug in the scratch disk handling.
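For reference, a minimal sketch of such a loop with a periodic purge added might look like the following. The template path, the input/output locations, and the replaceContents() stub are placeholders for whatever your script actually does:
// Hypothetical batch loop; paths and replaceContents() are placeholders,
// not the original script.
function replaceContents(doc, externalFile) {
    // ... the actual replacement step (e.g. swapping in the external file) goes here ...
}

var template = new File("/path/to/template.psd");
var inputs = new Folder("/path/to/inputs").getFiles("*.png");
var outPath = "/path/to/output";

for (var i = 0; i < inputs.length; i++) {
    var doc = app.open(template);
    replaceContents(doc, inputs[i]);
    doc.saveAs(new File(outPath + "/variant_" + i + ".png"), new PNGSaveOptions(), true);
    doc.close(SaveOptions.DONOTSAVECHANGES); // discard changes so the template stays untouched
    if (i > 0 && i % 50 === 0) {
        app.purge(PurgeTarget.ALLCACHES);    // drop caches every 50 documents
    }
}
Whether app.purge() actually shrinks the scratch file on disk is something I would verify with a long test run; if it doesn't, restarting Photoshop at intervals may remain the only reliable option.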