The problem:
The Photoshop tech documents from April 2019 say that it supports 64 exabytes of scratch space (https://helpx.adobe.com/photoshop/kb/optimize-photoshop-cc-performance.html),
but when I reach around 2.2-2.5 TB of total scratch usage I start getting the "scratch disks are full" error, even when there is plenty of space left on the disk and more on further disks in the scratch disk priority list (their order doesn't matter; it simply happens at 2.2 TB+ no matter which disk is being used when that point is reached).
The scratch files are 68 GB each, and the last one at the error point is smaller than the others, just like when I genuinely run out of space and Photoshop tries to fit in the last bit; except this is an invisible wall, and there is still plenty of space and there are disks left.
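One observation, purely my own speculation and not anything from Adobe's documentation: 2 TiB is 2^41 bytes, which is about 2.2 TB in decimal units, exactly where the wall appears; it is also what a 32-bit count of 512-byte sectors gives you (the same arithmetic behind the classic MBR 2 TiB limit). A quick sanity check of that arithmetic in Python:

    # Speculative arithmetic only: does the ~2.2 TB wall line up with a
    # 32-bit counter somewhere in the scratch bookkeeping?
    SECTOR = 512                   # classic disk sector size in bytes
    limit = (2**32) * SECTOR       # 32-bit sector count -> 2 TiB
    print(limit)                   # 2199023255552 bytes
    print(limit / 1e12)            # ~2.199 decimal TB, right at the wall
    print(limit / (68 * 10**9))    # ~32 of the 68 GB scratch files fit

If that guess is right, the GPT tests listed below would rule out the disk layout itself and point at a 32-bit quantity inside Photoshop's own scratch accounting.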
To reproduce the problem:
1. Create a big document, at least around 5-7 GB on disk (it can be more, that doesn't matter, but we will need multiple files rather than one supergiant doc). For example, create a canvas of say 150,000 x 150,000 pixels, scale some image up inside the document to fill those pixels, and save it.
2. If the file size is big enough, duplicate this file perhaps 10 times on disk and rename each copy, so you can then open them all as separate documents inside Photoshop at the same time (this will probably take a few minutes depending on the PC).
3. Check the total Photoshop scratch size in the lower-left info panel. If it is close to 2 TB, start making a single brushstroke in each of the opened documents; the very first stroke in a document produces an additional big scratch file (only the very first stroke, subsequent ones won't create further scratch files), so doing that in each opened doc should get you into the problem area of 2.2-2.4 TB. If not, open more docs to get there.
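Rough sizing for step 1, assuming 8-bit RGB at 3 bytes per pixel and no compression (both assumptions are mine): a flattened 150,000 x 150,000 canvas comes out suspiciously close to the 68 GB scratch file size mentioned above.

    # Back-of-the-envelope canvas size; 8-bit RGB with no alpha and no
    # compression are my assumptions, not something Photoshop guarantees.
    width = height = 150_000
    bytes_per_pixel = 3
    raw = width * height * bytes_per_pixel
    print(raw / 1e9)        # ~67.5 GB per flattened canvas
    print(10 * raw / 1e12)  # ten such canvases ~ 0.675 TB before strokes

In practice each open document seems to account for a small multiple of that (composite plus layer plus first-stroke data, presumably), which is how around ten documents get the total near 2 TB.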
- At around that point (2.2-2.4 TB) you should get the "scratch disks are full" error even though there is plenty of space left. Each attempt to make a brushstroke that triggers the error creates two zero-sized scratch files, and an attempt to save a document while the error is active produces a zero-sized .tmp file at the document's path.
If you were able to get to 2.6 TB and beyond without errors, please let me know.
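If you want to watch the wall being hit while reproducing this, here is a small helper sketch I'd use (my own script, nothing official; it assumes the usual "Photoshop Temp..." scratch file naming on Windows, and SCRATCH_DIR is a placeholder you should point at your own scratch disk):

    import glob, os, time

    SCRATCH_DIR = "D:\\"  # placeholder: root of your first scratch disk

    while True:
        # Photoshop scratch files on Windows are named "Photoshop Temp..."
        files = glob.glob(os.path.join(SCRATCH_DIR, "Photoshop Temp*"))
        sizes = [os.path.getsize(f) for f in files]
        zero = sum(1 for s in sizes if s == 0)
        print(f"{len(files)} scratch files, {sum(sizes) / 1e12:.3f} TB total, "
              f"{zero} zero-byte (the error symptom)")
        time.sleep(10)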
What I tried:
- Fresh installations of PS 2018 (19.1.7, 64-bit) and PS 2019 (20.0.4, 64-bit)
- Both on Windows 8.1 64-bit (legacy BIOS mode, all disks MBR)
- Both on a fresh Windows 10 64-bit (2019) install (fully UEFI mode, all disks GPT; all drivers were also different from the ones used on Windows 8.1)
- Tried on a completely different PC with different hardware, OS, and drivers; the problem persisted
- Tried various combinations of physical scratch disks adding up to at least 3 TB, such as a single 3 TB disk, or 1 TB + 1 TB + 1 TB + 3 TB completely different disks. Always the same problem at around 2.0-2.5 TB, no matter which disk the scratch data is being written to when the limit is reached; different disks or a different order makes no difference. No other storage devices or anything unusual connected to the motherboard.
- No other apps or OS processes are accessing the disks or limiting anything (for the Windows 10 test and the other-PC test there was literally just Photoshop installed)
- Tried with VMCompressPages 0 (see the PSUserConfig.txt note after this list)
- Clearing the cache doesn't help
- I have NOT tried reaching the limit with a single giant document with many layers, as opposed to multiple open documents
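For reference, the VMCompressPages setting above was applied via PSUserConfig.txt in the Photoshop settings folder; the exact path varies by version, this is what it looked like for PS 2019 on Windows:

    File: C:\Users\<user>\AppData\Roaming\Adobe\Adobe Photoshop CC 2019\Adobe Photoshop CC 2019 Settings\PSUserConfig.txt

    VMCompressPages 0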
Questions:
- Could it be that the "support for exabytes of scratch size" claim is untested in the later versions? Or does it perhaps only apply to a SINGLE giant document with many layers, and not to multiple documents being open and worked on at the same time, as in my case? Or does it need some special hardware or a non-Windows OS? Is there anything else I could try to test?