After about 30 minutes of using Lightroom Classic, VRAM usage spikes to nearly all available and the program's performance absolutely tanks. When I first start it, it runs amazingly, very snappy, but this quickly deteriorates to the point of being unusable and I have to restart the program. I've also noticed that other programs on the computer start to suffer, since there is no VRAM/GPU capacity left for anything else, and at least once it has caused a full-on BSOD.
RTX 3080 - 1920x1080, Ryzen 3900X, 64GB DDR4 and on a Samsung 970 Evo+ SSD.
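For anyone wanting to confirm whether VRAM is actually climbing over time (rather than just feeling sluggish), one way is to poll `nvidia-smi` while Lightroom runs. Here is a rough Python sketch; it assumes `nvidia-smi` is on the PATH, and the helper names are my own, not part of any Adobe or NVIDIA tool:

```python
import subprocess
import time


def parse_used_mib(csv_text: str) -> int:
    """Parse the output of nvidia-smi --query-gpu=memory.used
    --format=csv,noheader,nounits, which is just a number in MiB
    per GPU, one per line. We take the first GPU."""
    return int(csv_text.strip().splitlines()[0])


def query_vram_used() -> int:
    """Ask nvidia-smi for the current VRAM usage of GPU 0 in MiB."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_used_mib(out)


if __name__ == "__main__":
    # Log usage every 30 seconds; a steady climb with no release
    # while Lightroom is idle would point at the leak described above.
    while True:
        print(f"VRAM used: {query_vram_used()} MiB")
        time.sleep(30)
```

If the logged number ratchets up toward the card's full 10GB and never comes back down until Lightroom is restarted, that matches the behavior described here.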
The issue you describe is known to the Lightroom Classic engineering team and is currently under investigation. Typically, the issue is more common on systems with 4GB or more of VRAM, and it becomes progressively worse when the GPU is fitted with even larger amounts of VRAM. Unfortunately, there is no workaround at present.
BTW, I note that you've also submitted a report on the Lightroom Classic Feedback forum. This is helpful, as it is much more likely to be picked up by Adobe staff there than would be the case in this forum. You may also find that your post gets merged into an existing thread already discussing the issue.
I see this too, but I have found a workaround: uncheck "Use GPU for image processing" (Preferences > Performance).
This reduces normal VRAM usage from over 90% to around 25%, with no other discernible ill effects.
Now, I should stress that while I have the high VRAM usage, I haven't really had many performance problems. The only one that bothers me a bit is actually seen in Photoshop: image windows tend to hang and "ghost" for a second or two when being closed. That stops immediately when I uncheck the option as above.
It is not necessary to do the same in ACR, which still runs fine with full GPU use. This seems to be very Lightroom-specific.