Photoshop VDI troubleshooting

New Here, Oct 27, 2021


We have a VMware linked-clone environment with a small pool of machines running CC for those who need it.  They had been plagued by lots of bugs and issues, and we realized we hadn't updated the software deployment in over a year (something something, everyone moving to work from home and back to the office over the past year, so some things got more out of date than we would like).  So we updated the suite, and *most* things are working much more smoothly... except Photoshop, which is the most heavily used product.  My understanding is that a VMware linked-clone deployment is not officially supported, so I am not exactly looking for a real fix, just advice on hardware allocations, settings, and troubleshooting techniques to make the best of what we have going on.

The main issue appears to be some sort of memory leak, and the workflow goes like this:
- User opens Photoshop, and RAM use is ~1GB.
- User opens any tiny image to crop/correct/etc., and ~200-300MB of RAM is allocated.
- Save or export that file, and another ~200-300MB is allocated.
- Close the file (not the app), and all of that RAM is still allocated.
- Open the next file to work on, and RAM use keeps growing; nothing is reclaimed until Photoshop is closed out.

If users do not regularly close out of Photoshop, it balloons to ~3.5-4GB of RAM used, the system runs out of memory, Windows freaks out, and the machine can randomly restart (which deletes it, because they are linked clones), or get so locked up that the user gets bumped off and cannot log back in until we kill a process remotely to free up some RAM, or recycle the machine so they can get a new one.  Rarely is much work able to be saved.
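As a rough sanity check on the growth pattern above (back-of-the-envelope arithmetic, not a measurement — the ~1GB baseline, ~250MB per open and per save, and ~3.5GB trouble point are the approximate figures from this post):

```python
# Rough estimate of how many open/edit/save cycles fit before the
# observed failure point, assuming no memory is reclaimed between files.
def files_until_limit(baseline_gb: float, per_file_gb: float, limit_gb: float) -> int:
    """Number of whole files processed before RAM use crosses limit_gb."""
    return int((limit_gb - baseline_gb) // per_file_gb)

# Figures from this thread: ~1GB at launch, ~0.25GB retained on open
# plus ~0.25GB on save/export, trouble starting around 3.5GB.
per_file = 0.25 + 0.25
print(files_until_limit(1.0, per_file, 3.5))  # -> 5
```

At ~0.5GB retained per file, a session reaches the ~3.5GB trouble zone after only about five files, which lines up with users failing "after getting in and out of a couple files" rather than lasting a shift.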

Problem #1 is that the users only have 12GB of RAM on their VMs.  We can expand that a little, but my fear is that it just extends the runway.  With the older version of Photoshop, 12GB was rarely an issue because they work with a lot of very small files (in both pixel count and file size), and as things were closed the RAM would free up to be reused.  So while we will likely bump them to 16GB of RAM, that would only help them last 3-4 hours rather than 1-2... and we would really like them to be able to work all day without issue.

#2 isn't really a problem, but I'm looking for any guides on the best settings to use in a limited/shared environment.  I thumbed through the MS guide for running CC on their VDI systems... and they used a test system with 256GB of RAM and a huge SSD, and were like 'it all worked great'.  We have a lot of resources... but split between ~150-200 machines/users.  Only a few are using CC, so we can allocate a little more to them than to others, but allocating as little as we can while still giving them a smooth experience is the goal.  They are doing a lot of cropping and touch-up work for web components.  Very small low-res files, but lots of them; not demanding work for the system, but when each file takes 200+MB and they are trying to get through hundreds of files... that is a lot of restarts of Photoshop.

#3 is also not exactly a problem, but I am not seeing much in the way of logs to help troubleshoot what is going on.  Any guidance on which logs are generated that I can look through?  Or ways to turn logging on/off to get more detail while working through an issue?
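Absent Photoshop-specific logs, one low-effort way to build a trail to look through is to sample the process's working set on a schedule. A minimal sketch of the parsing side, assuming Windows and the stock CSV output of `tasklist` (the process name and the sample line are placeholders):

```python
import csv
import io

def parse_tasklist_mem_kb(csv_line: str) -> int:
    """Extract the 'Mem Usage' field (in KB) from one line of
    `tasklist /fi "imagename eq Photoshop.exe" /fo csv /nh` output,
    e.g. '"Photoshop.exe","4242","Console","1","1,048,576 K"'."""
    row = next(csv.reader(io.StringIO(csv_line)))
    mem = row[4]  # last column looks like "1,048,576 K"
    return int(mem.replace(",", "").rstrip(" K"))

print(parse_tasklist_mem_kb('"Photoshop.exe","4242","Console","1","1,048,576 K"'))
# -> 1048576
```

Run `tasklist` via a scheduled task every minute or so, append a timestamp plus the parsed KB to a CSV, and you have a growth log you can correlate with user activity (running `tasklist` itself is Windows-only, of course).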

TOPICS
Windows

Community guidelines
Be kind and respectful, give credit to the original source of content, and search for duplicates before posting. Learn more
community guidelines
Adobe Community Professional, Oct 28, 2021


Photoshop relies heavily on not just RAM but also scratch disk space. 12GB is not enough RAM, and typically at least a few hundred GB of scratch disk just for Photoshop is optimal.
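A quick way to check whether the scratch volume actually has the kind of headroom described here, using only the Python standard library (the path is a placeholder for whatever volume Photoshop's scratch disk preference points at):

```python
import shutil

def free_gb(path: str) -> float:
    """Free space on the volume containing `path`, in GB."""
    return shutil.disk_usage(path).free / 1e9

# Placeholder path: point this at the volume configured under
# Preferences > Scratch Disks on the VM image.
print(round(free_gb("."), 1))
```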

New Here, Oct 28, 2021


No issues with drive space. We recently updated, and the added faster storage certainly helped. Our constraints are more on the RAM and GPU side of things until we can update the hosts, but we would like to get more time out of the hosts we are using if we can.

 

We toyed with some settings today, which I'll post below, and they helped a ton. Not perfect, but much smoother for our users today.

New Here, Oct 28, 2021


Not perfect, but the settings below have helped a lot today:

 

Edit > Preferences > Performance:

Set RAM use to 3GB, since the problems occurred around the 3.5GB mark. Ironically I was able to get it to use well more than that limit... but I had to try a lot harder.

Optimized for web and UI content, which lowered the cache allocations.

 

The largest difference was changing the history states (which I think are the undo option?) from the default of 50 down to 10. At the very least, this seems to keep the RAM creep from getting out of hand quickly.
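A rough way to see why this change would matter so much (back-of-the-envelope only; the per-state cost below is a made-up illustrative figure, since what each history state actually holds depends on the edit):

```python
# Worst-case memory held by undo history scales with the number of
# history states retained. The per-state cost here is hypothetical.
def history_footprint_mb(states: int, per_state_mb: float) -> float:
    return states * per_state_mb

per_state = 20.0  # illustrative cost per history state for small files
print(history_footprint_mb(50, per_state))  # default of 50 states -> 1000.0
print(history_footprint_mb(10, per_state))  # reduced to 10 states ->  200.0
```

Whatever the true per-state cost, dropping from 50 to 10 states caps the history's contribution at one fifth of the default, which is consistent with the much slower creep observed after the change.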

 

I had my test user in and out of several items over the course of 2 hours so far, and RAM use was still under 1GB. Before applying these changes they were hitting 3+GB after getting in and out of a couple of files.

 

Still looking for more tips and tricks to try if anyone has some! 
