Photoshop is unable to use more than 2-2.5TB of total scratch size (but the tech docs say it supports 64 exabytes)
The Photoshop tech documents from April 2019 say that it supports 64 exabytes of scratch space ( https://helpx.adobe.com/photoshop/kb/optimize-photoshop-cc-performance.html ),
but when I reach around 2-2.5 TB, I start getting the "scratch disks are full" error even when there is another 2 TB of space left on the disk and more on further disks in the scratch disk priority list (their order also doesn't matter).
The scratch files are 68 GB each, and the last one at the error point is smaller than the others, just like when you genuinely run out of space and Photoshop tries to fit in the last bit, except there is still plenty of space left.
To reproduce the problem:
- Create a big document of at least around 5-7 GB (it can be more; the exact size doesn't matter, but we will need multiple files, not one supergiant doc). For example, create a canvas of 150,000 x 150,000 pixels, scale some image up inside the document to fill those pixels, and save it.
- If the file size is big enough, duplicate this file on disk perhaps 10 times and rename each copy so you can open them all as separate documents inside Photoshop at the same time (a small helper script for the duplication is sketched below this list). Opening them will probably take a few minutes depending on the PC.
- Check the total Photoshop scratch size in the lower-left info panel. If it is close to 2 TB, start making a single brushstroke in each of the opened documents. Only the very first stroke in a document produces an additional big scratch file (subsequent strokes won't create further ones), so doing this in each opened doc should get you to the problem area of 2.2-2.4 TB. If not, open more docs to get there.
- At around that point you should get the "scratch disks are full" error even though there is plenty of space left. Each attempt to make a brushstroke that produces the error creates two zero-sized scratch files, and an attempt to save a doc while the error persists produces a zero-sized .tmp file at the document's path (a quick way to inspect the scratch files is sketched below as well).
If you were able to get to 2.6 TB and further without errors, please let me know here.
- However, I haven't tried this with just a single giant document with many layers as opposed to multiple docs.
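In case it helps anyone reproducing this, here is a minimal sketch of the duplication step in Python 3. The path D:\scratch-test\source.psb, the copy count, and the naming scheme are just placeholders, so adjust them to your setup:

```python
import shutil
from pathlib import Path

# Placeholder path to the big base document created in the first step.
SOURCE = Path(r"D:\scratch-test\source.psb")
COPIES = 10  # open more copies if 10 doesn't reach ~2 TB of scratch

for i in range(1, COPIES + 1):
    # source.psb -> source_01.psb, source_02.psb, ...
    target = SOURCE.with_name(f"{SOURCE.stem}_{i:02d}{SOURCE.suffix}")
    if not target.exists():
        shutil.copy2(SOURCE, target)  # plain byte-for-byte copy
        print(f"created {target}")
```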
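And to check for the zero-sized scratch files mentioned above, something like the following works. It assumes the scratch files sit at the root of the scratch disk (E:\ here, a placeholder) and follow the usual "Photoshop Temp..." naming on Windows; adjust both if your setup differs:

```python
from pathlib import Path

# Placeholder: root of one of the configured scratch disks.
SCRATCH_ROOT = Path("E:/")

total = 0
for f in sorted(SCRATCH_ROOT.glob("Photoshop Temp*")):
    size = f.stat().st_size
    total += size
    flag = "  <-- zero-sized" if size == 0 else ""
    print(f"{f.name}: {size / 1024 ** 3:.1f} GB{flag}")

print(f"Total scratch on this disk: {total / 1024 ** 4:.2f} TB")
```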
What I tried:
- Fresh installation of PS 2018 19.1.7 64-bit (all preferences wiped; only autosave, compression, and PSB compatibility disabled)
- Fresh installation of PS 2019 20.0.2 64-bit (same as above)
- Both on Windows 8.1 64-bit (legacy BIOS mode, all disks MBR)
- Both on a fresh Windows 10 Pro 64-bit (fully UEFI mode, all disks GPT; all drivers were different from the ones used on Windows 8.1, though it's still the same PC)
- Tried on a completely different PC with different hardware, OS, and drivers; the problem persisted
- Tried various combinations of physical scratch disks adding up to at least 3 TB, including just the OS disk (GPT) plus one 3 TB GPT disk with no other storage devices or anything unusual connected to the motherboard. Always the same problem at around 2.0-2.5 TB, no matter which disk the scratch data is being written to when the limit is reached; different disks or a different order make no difference
- No other apps or OS processes are accessing the disks or limiting anything (in the Windows 10 test, there was literally just Photoshop installed)
- Took a look at the OS pagefile, but Photoshop doesn't seem to touch it at all, at least not just to open these docs and make a stroke in each
- Tried with VMCompressPages 0 (the PSUserConfig.txt setting)
- Clearing the cache doesn't help
Questions:
- Is that "support for exabytes of scratch size" claim untested in the later versions? Or does it perhaps apply ONLY to a single giant document with many layers, and not to multiple documents being open and worked on at the same time like in my case? Or does it need some special hardware or OS to utilize?
(I made a similar thread some time ago; feel free to delete it. It had a misleading title since I hadn't done enough testing yet, the wording was bloated, and the discussion contained zero value for future users with the same problem anyway.)
