I have done numerous hours of research over the years only to give up and come back to this subject later on. I will try again!
I must ask: is there ANY way to get Photoshop CS4 to automatically flush used RAM after closing image files? I'm tired of and annoyed by CS4 eating more and more RAM without freeing anything after an image has been closed. An image shouldn't still eat up memory when it isn't even open anymore. It seems to be terrible RAM management in the program itself, as if Adobe forgot to write that little snippet of code to clear the RAM after an image has been closed.
I know this is not because of:
Cache levels
Setting too large a saved history state count; purging history or any other kind of cache still never frees RAM
A lack of updates; I have been up to date with CS4 very punctually as they were released
The computer I am on; this poor RAM management has been around since CS2 and it's the same on the last 4 computers I have used
Does CS5 still do this too?
Noel Carboni wrote:
Are you thinking of something other than a PC or Mac system? These are not multi-user systems as far as I know, at least not in the "multiple users can use Photoshop simultaneously" sense...
While multiple users can indeed log into a Windows terminal server system and run apps independently, the necessity of having accelerated graphics needed by Photoshop just isn't there (Photoshop will not use accelerated OpenGL when running across RDP), and you can't tie more than one physical console (monitor/keyboard/mouse) directly to a PC.
I have only witnessed PCWin development systems with 10 or so users who connect via emulator software, all logging in as different entities. (Yes, effectively using their desktops as dumb terminals).
You make an excellent point about accelerated graphics and dumb terminals, and frankly I don't know enough about the requirements of PS users and the impediments caused by using a system in such a way. One limitation right off the top of my head: monitor calibration would be very difficult.
I guess I won't run my mouth trying to defend the multi-user angle anymore.
On one hand "it seems" Adobe made a business decision to do it; on the other hand, if they did do it for that reason, how are the usage limitations dealt with? I just don't know.
So from here I bow to your arguments previously made in the posts above.
I'm less confident that multiple consoles cannot be tied to a Mac.
There used to be, about 7 years ago. The custom TERMU I knew of died with the popularity of HTML.
Thanks for your thoughts. I think we're on the same page.
I used to have fun setting up Microsoft Terminal Services for engineers at a multi-location company (we had big offices on both sides of the country). It turned out some operations needed high bandwidth (and highly interactive) access to databases while their user interfaces were relatively low bandwidth - a good candidate for Terminal Services where the app was near the data, even though the user was a long way off. We built some hefty servers for the task. Many things we ran worked surprisingly well, but some never *quite* as well as running them on one's own workstation. We didn't try Photoshop in that environment.
However, I develop and test graphics apps on my own systems here at my company, sometimes via RDP between different Windows systems. It is through this activity that I've learned how Photoshop and OpenGL relate to such operation. It actually seems as though most of the framework is there to be able to do the graphics acceleration on the remote UI, as OpenGL is implemented as a pipeline, but it just doesn't seem to be completely there yet and apps can just run the Microsoft OpenGL 1.1 software emulation, or use GDI and do pixel transfers across the LAN (Photoshop does this). Photoshop can be somewhat usable in this arrangement, but I sure wouldn't want to do a lot of production work that way.
I should think that if remote OpenGL implementation were completed (by Microsoft? Adobe? both?) the monitor profile on the remote "terminal" could be used to drive the color-management.
But would we really want this? Photoshop is a very resource-intense application, as mentioned above... I shudder to think of the bottlenecks in trying to use Photoshop on the same machine at the same time as half a dozen other people. Imagine everyone doing a long filter operation at the same time... Or worse, running the system out of memory...
Lastly - and it's a more basic issue - don't I recall that Photoshop licensing is not set up to allow more than one user on the same license? I could be mistaken, though, re: licenses set up specifically for businesses.
I'll stop my rambling now.
-Noel
I shudder to think of the bottlenecks in trying to use Photoshop on the same machine at the same time as half a dozen other people
6-core CPUs are cheap as dirt now... by the time someone tries it, 12 will be in, and 6 users could have a comfy 2 cores each! Hurrah! It's the way of the future! Invest all your life's savings!
lol
Overfocused wrote:
6 core CPUs are cheap as dirt now... by the time someone tries it, 12 will be in and 6 users could have a comfy 2 cores each! Hurrah! Its the way of the future!
Actually, the CPU processing power isn't the bottleneck I was thinking of - access to main memory is, not to mention bus bandwidth to displays.
RAM cache on CPU cores is still in the "few megabytes" range. Images people work on in Photoshop are often in the "few hundred megabytes" range. Thus all the CPU cores will be demanding memory bandwidth. Main RAM only runs so fast.
And while you might wonder if the future holds bigger and bigger processor cache, I'd counter with saying the future holds bigger and bigger images.
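The argument above can be put in rough numbers. A back-of-envelope sketch, with all figures purely hypothetical and chosen only to match the "few megabytes" and "few hundred megabytes" ranges mentioned above:

```python
# Back-of-envelope sketch: why shared main-memory bandwidth, not core
# count, would limit a hypothetical multi-user Photoshop box.
# Every number here is an illustrative assumption, not a measurement.

cache_per_core_mb = 8        # per-core cache, "few megabytes" range
image_size_mb = 300          # a working file, "few hundred megabytes" range
users = 6                    # six people sharing one machine
ram_bandwidth_gb_s = 20.0    # assumed aggregate main-RAM bandwidth

# Each user's working set dwarfs the cache, so filter passes stream
# from main RAM rather than being served from cache.
cache_coverage = cache_per_core_mb / image_size_mb   # ~2.7% of the image

# If every user runs a filter that reads and writes the image once,
# all six streams compete for the same memory bus:
traffic_gb = users * 2 * image_size_mb / 1024        # GB moved per pass
seconds_per_pass = traffic_gb / ram_bandwidth_gb_s   # lower bound from RAM alone

print(round(cache_coverage, 3))    # 0.027
print(round(traffic_gb, 2))        # 3.52
print(round(seconds_per_pass, 3))  # 0.176
```

Even with idealized numbers like these, the cores spend their time waiting on the same memory bus, which is the point being made.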
You yourself have pointed out how cheap processing power is... This extends to systems. How much cheaper, really, is a thin client with a nice display than a decent workstation with a nice display? A thousand dollars? A server class machine that's good for even only a few users could easily push prices back up and you'd end up no better off.
Photoshop seems about the worst candidate for multiuser operation that I can think of, save for maybe a video game.
Graphics processing will benefit most greatly, in the future, from MORE parallel hardware, not less.
-Noel
Is there a Cray in your future?
Nah, just li'l ol' GPUs, where the REAL power is.
Think about it... Teraflop (!!) compute power on a little board you can go down to Best Buy and pick up for a few hundred bucks. The GPU revolution has been occurring all around us quietly... Remember when a gigaflop was a big deal?
-Noel
I remember when Intel assembled the first massively parallel computers with a bunch of Pentiums. Had a friend that worked on it.
Seems like so last century!
Got a spare 50 grand?
http://accessories.us.dell.com/sna/products/Processors/productdetail.aspx?sku=A3336421
Actually, the CPU processing power isn't the bottleneck I was thinking of - access to main memory is, not to mention bus bandwidth to displays.
RAM cache on CPU cores is still in the "few megabytes" range. Images people work on in Photoshop are often in the "few hundred megabytes" range. Thus all the CPU cores will be demanding memory bandwidth. Main RAM only runs so fast.
And while you might wonder if the future holds bigger and bigger processor cache, I'd counter with saying the future holds bigger and bigger images.
(etc)
-Noel
True, I was half kidding around though.
Although with tri and quad 16x SLI/Crossfire motherboards now the norm at around $150-300, you could probably give a GPU to each separate user. That's what current thin clients prefer anyway, no?
It'd be a fun experiment at least... memory/cache would still be the issue like you're saying, but it's still fun to goof off with the wacky stuff coming out for cheap now
When you say "each separate user", are you implying that more than one operator would actually be using the workstation simultaneously?
When you say "each separate user", are you implying that more than one operator would actually be using the workstation simultaneously?
That's what a thin client host program does... lets more than one user use a machine simultaneously
Some are a bit primitive, but the good ones create a separate environment for each user and will assign a video card, mouse, and keyboard to them. Some clients won't even work without multiple video cards inside a machine; they make it a requirement. Others are USB driven and have video bandwidth limited to the USB port it's plugged into... for, like, a web-browsing computer hub.
Noel Carboni wrote:
I develop and test graphics apps on my own systems here at my company, sometimes via RDP between different Windows systems. It is through this activity that I've learned how Photoshop and OpenGL relate to such operation. It actually seems as though most of the framework is there to be able to do the graphics acceleration on the remote UI, as OpenGL is implemented as a pipeline, but it just doesn't seem to be completely there yet and apps can just run the Microsoft OpenGL 1.1 software emulation, or use GDI and do pixel transfers across the LAN (Photoshop does this). Photoshop can be somewhat usable in this arrangement, but I sure wouldn't want to do a lot of production work that way.
I should think that if remote OpenGL implementation were completed (by Microsoft? Adobe? both?) the monitor profile on the remote "terminal" could be used to drive the color-management.
Lo and behold, it turns out Microsoft HAS been working on this. It's called RemoteFX technology, and it's supposed to be imminent - with the release of Windows 7 Service Pack 1, specifically. SP1 RTM already exists; it's just not available to the public yet.
A description, from Tom Warren's blog on winrumors.com...
RemoteFX is a new enhancement to RDP’s graphical remoting capabilities. The idea behind RemoteFX is to allow for a full remote experiences including multiple displays, Aero and multimedia streaming to all types of client devices including low cost thin clients. RemoteFX achieves this by using a technique known as host-based rendering. This technique allows for the final screen image to be rendered locally on the remote PC after being compressed and sent down to that remote host. The enhancements are expected to greatly improve video streaming across remote sessions which is currently one of the major drawbacks of virtualized computing.
-Noel
Bah, on initial testing it appears that even though Aero is now available through RDP, RemoteFX doesn't advertise the OpenGL acceleration capabilities of the client system to applications. In layman's terms, unless I'm missing something, Photoshop won't be able to do accelerated operation through RDP even with Windows 7 SP1.
OpenGL Version 1.1.0 ()
OpenGL Vendor 'Microsoft Corporation'
OpenGL Renderer 'GDI Generic'
Available Extensions:
GL_WIN_swap_hint
GL_EXT_bgra
GL_EXT_paletted_texture
-Noel
I have to admit, PS (CS5, Windows XP) does slow down after I process many images. Closing and restarting PS helps but rebooting the OS is better.
In preferences, what is the best percentage setting for memory usage? I have read various posts on this.
Interesting comment from Chris (thanks btw, for your feedback) - logging on as a different user. Is this a good alternative that can reduce other, unnecessary programs from running in the background?
Thanks ...
Reading through the replies - I'm not sure the average user could follow (I'm a photographer and pretty low on the software-architecture side of the index). However, here are links to some basic information on PS performance. You may have already found this, but it might be useful to others.
http://www.thelightsright.com/OptimizePerformanceCS5-Part1
http://www.thelightsright.com/OptimizePerformanceCS5-Part2
Back to the original subject: Multitasking 101
A scenario in which the "choose how much to allocate ahead of time" strategy can be seen to fall on its face in a multitasking system:
1. Let's say you have 8 GB of RAM.
2. Go into Edit - Preferences - Performance and set Photoshop's RAM to, say, 7 GB.
3. Close Photoshop.
4. Run another big program you don't run very often, e.g., VMware, that preallocates 4 GB.
5. Start Photoshop again.
Now Photoshop doesn't use a percentage of the current available RAM, but rather (with sufficient activity) it tries to use the saved AMOUNT of 7 GB, but of course there's nowhere near that much available at the moment because it's being used by another app.
This isn't theoretical; it's easy to demonstrate. All it takes is two apps that preallocate memory and think they're the centers of the computing universe. You may say that Photoshop tries to play nice when it sees excessive swapping, but systems actually slow to a crawl under this condition. Saying it and doing it are two different things.
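The difference between the two strategies can be sketched in a few lines. The numbers match the scenario above; the function names are purely illustrative, not anything from Photoshop's internals:

```python
# Sketch of the scenario above: a saved fixed-size allocation vs. a
# percentage-of-currently-available strategy. Numbers and function
# names are illustrative assumptions, not Photoshop internals.

TOTAL_RAM_GB = 8.0
saved_amount_gb = 7.0   # fixed amount saved in Preferences (step 2)
vm_prealloc_gb = 4.0    # VMware grabs this before Photoshop restarts (step 4)

available_gb = TOTAL_RAM_GB - vm_prealloc_gb   # only 4 GB actually free

def fixed_amount_request(saved_gb):
    """Request the saved amount regardless of what is free right now."""
    return saved_gb

def percent_of_available_request(fraction, free_gb):
    """Request a fraction of what is free at launch time."""
    return fraction * free_gb

fixed = fixed_amount_request(saved_amount_gb)                     # 7 GB
adaptive = percent_of_available_request(7.0 / 8.0, available_gb)  # 3.5 GB

print(fixed > available_gb)      # True  -> overcommit, heavy swapping
print(adaptive <= available_gb)  # True  -> fits in what is free
```

The fixed-amount request overshoots what the machine can supply at that moment, which is exactly the overcommit the steps above demonstrate.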
The only workable solution is to set Photoshop's RAM allocation to the SMALLEST amount of memory that will be available at any time you might want to run it, and to install enough RAM for everything you could possibly run simultaneously to coexist.
This completely defeats the purpose of having dynamically allocated resources on a general purpose computer, don't you think?
Modern systems support more tasks and do better multitasking than ever before, and quite stably. We've already been talking about the reasons why (e.g., more cores). Photoshop's memory management design might have been more appropriate for the ancient Macintosh on which it was invented (and which only did cooperative multitasking), but the "when we try to use the OS it blows" and "professionals run only Photoshop" arguments just don't hold water.
Chris, I do hope you'll consider making Photoshop act a little more grown up in the playground in the next version. Don't forget what the 'S' in CS6 is supposed to stand for.
-Noel
Now Photoshop doesn't use a percentage of the current available RAM, but rather (with sufficient activity) it tries to use the saved AMOUNT of 7 GB, but of course there's nowhere near that much available at the moment because it's being used by another app.
You're ignoring virtual memory and OS paging.
But having multiple apps with high RAM usage is exactly why Photoshop offers the user a choice on RAM usage. If two RAM-hungry apps tried to be completely dynamic about their usage, things would be much, much worse (i.e., they'd fight over the RAM and both would lose, or page heavily). That's usually taught as a lesson in operating systems classes.
No, Photoshop's memory management is very much needed for current OSes.
Photoshop is already designed to work well in an environment with multiple applications running, even those that might be actively using a lot of RAM. We've been working with the OS vendors to improve things all along, researching and testing other methods, and tweaking our algorithms to work well with current systems.
Just because you don't understand the details doesn't make it wrong.
Actually, that's what I did by watching the allocations in the resource monitor - a SWAG, but perhaps a good first approximation.
I've been watching the resource monitor all morning, as well as PS performance on a 70MB file with 6 layers, one a B&W conversion I use to really tweak color!
It's been flitting around a 5.5GB combination of standby and free. I'm sitting at 6.5GB in Prefs (90%).
I started to notice some hesitations in running changes, so I saved, exited all programs, and opened PS; it seems smoother, but not by much. Right now with only PS and FF running, total free and standby is 5.8GB, with 2200MB in use.
Later today, I'll run the setting in Prefs to 95% and try again.
Vitals:
Win7 64
Athlon II X4 running at 3GHz
8GB RAM at 1333MHz
Tiles at 1028K
History at 100
nVidia 9500 GT chip
OpenGL activated
Hi,
Have you tried bumping up the voltage to the RAM in the BIOS? Maybe it is unstable with 8 gigs trying to be used?
Just my 2 1/2 cents worth of input for this week.
Hi,
Have you tried bumping up the voltage to the RAM in the BIOS? Maybe it is unstable with 8 gigs trying to be used?
Just my 2 1/2 cents worth of input for this week.
It's dual channel and it's using both modules 24/7 no matter what I'm doing. My motherboard is made to use 8GB so I don't see that being a problem unless I were overclocking (which I don't have the option to do via BIOS)
Curious. What board, processor?
DC9, bumping the memory voltage did not occur to me, and it's rather unimportant: the slight gain from holding on to the memory speed I had before adding the 4GB isn't worth the hassle. If I had run into marginal stability anyway, I would have looked into the voltage parameter.
I usually err on the side of stability. If I want additional memory speed, I'll likely buy 1666MHz.
Memory is stock at 1.5v.
Curious. What board, processor?
I assume you're talking to me since I'm the only other guy talking about his motherboard, hehe
I have a Gateway P7805u laptop with a PM45 chipset and a T9900 CPU (3.06GHz)
Not sure what the actual board is, never looked it up. Probably some Gateway part #.
Here's what I found about the Gateway:
Gateway® P-7805u FX Edition
Processor: Intel® Core™2 Duo Processor P8400 (2.26GHz, 1066MHz FSB, 3MB L2 Cache)
Operating System: Genuine Windows Vista® Home Premium (64-bit) with SP1
Memory: 4096MB 1066MHz DDR3 Dual Channel Memory (2 x 2048MB modules)
Hard Drive: 320GB 7200RPM SATA hard drive
Chassis: Chassis with NVIDIA® GeForce® 9800M GTS Graphics with 1GB of GDDR3 Discrete Video Memory and Intel® PM45 Chipset
With his specs listed, it looks like he did a custom order? If this laptop came with Vista and he loaded Windows 7 on it, he may need to update the BIOS to run Windows 7 correctly. Even my motherboard needed a BIOS update because 8 gigs of RAM would not be read correctly after I loaded Windows 7. 4 gigs was fine, but 8 gigs caused a problem. This was when I went from Vista to Windows 7 (8 gigs was fine on Vista).
Anyway, Lawrence, I know you are aware of all this stuff since you "Tinker". I just wrote it out for the poster to read so he will know that sometimes Windows 7 needs a little more love to run correctly (if the poster did in fact load Windows 7).
According to the spec, he has Vista 64 Home.
Ok, you saw that.
Cripes! We are on Page 3!
BTW, I did consider the possibility that perhaps I may need to re-install Win 7, but after talking with the AMD tech (a very capable person), I saw what needed to be done.