
Can't play realtime in 32-bit mode.

Contributor, Feb 19, 2024


I have been working in ACES in AE for 5 years now.

The workflow now is pretty easy so congratulations to the team.

One thing bothers me, though.

I have a 7950X, 128 GB RAM, and an RTX 4090.

I can play 4K, 6K, and 8K comps in realtime in 8- and 16-bit mode, but when I switch to 32-bit mode, playback usually can't reach 25 fps even in a 4K comp; I have to drop to half resolution, sometimes lower.

Is there any reason for this?

I mean, the project is calculated in 32-bit, so it takes more processing - but once it's calculated, what's stopping it from playing in realtime?

32-bit takes more RAM, but once it's cached and entirely in RAM, it shouldn't interfere with realtime playback.

It's not like the image is being sent to the monitor in 32 bits - there is no such standard. The rendered frames are still sent in 10-bit whether the project is calculated in 16-bit or 32-bit - so why the drop in performance?
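The bandwidth behind those bit depths can be sketched with back-of-the-envelope arithmetic (assuming uncompressed RGBA buffers; AE's actual internal formats and overhead may differ):

```python
# Per-frame memory footprint of an uncompressed RGBA frame at
# different bit depths, and the bandwidth needed for 25 fps playback.
# Rough figures only, assuming a UHD "4K" frame (3840x2160).

def frame_bytes(width, height, bits_per_channel, channels=4):
    """Size of one uncompressed frame in bytes."""
    return width * height * channels * (bits_per_channel // 8)

for bpc in (8, 16, 32):
    fb = frame_bytes(3840, 2160, bpc)
    mib_per_frame = fb / 2**20
    gib_per_sec = fb * 25 / 2**30          # 25 fps stream
    print(f"{bpc:2d} bpc: {mib_per_frame:6.1f} MiB/frame, "
          f"{gib_per_sec:5.2f} GiB/s at 25 fps")
```

A 32-bit float frame is four times the size of an 8-bit one, so every copy, cache read, and display conversion moves four times the data, even when the math itself is already done.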

 

Also, it seems performance gets worse the bigger the preview is:

- playing at 50% comp-viewer zoom is smoother than at 100%

- playing at 100% zoom in a large comp-viewer window is slower than at 100% in a small comp-viewer window (in which case a smaller part of the frame is visible)

 

Why does the number of pixels the comp view takes up on screen influence performance?

No other software, including file-sequence players like DJV, shows such behavior. If it can play back the sequence in realtime at 50% zoom, it will play back in realtime at 100%, 200%, even 800% if I want.
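One plausible explanation - an assumption about how the viewer works, not confirmed AE behavior - is that display conversion (resampling plus float-to-display color conversion) is done once per visible screen pixel on every refresh, so the per-frame display cost scales with the viewer window rather than the comp:

```python
# Hypothetical cost model: if the viewer resamples and color-converts
# the cached frame per visible *screen* pixel each refresh, the work
# scales with the viewer window size, not the comp size.

def viewer_pixels(viewer_w, viewer_h):
    """Pixels the viewer must fill on screen each refresh."""
    return viewer_w * viewer_h

full = viewer_pixels(3840, 2160)   # comp viewer at 100% on a UHD screen
half = viewer_pixels(1920, 1080)   # same comp shown at 50% zoom
print(f"100% zoom touches {full // half}x the pixels of 50% zoom")
```

Under that model, halving the zoom quarters the per-frame display work, which would match the observed behavior; a player that converts once and hands the result to the GPU for scaling would not show it.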

Topics: Performance, Preview

Views: 69

Community guidelines: Be kind and respectful, give credit to the original source of content, and search for duplicates before posting.
Explorer, Feb 19, 2024


The issue is that you're not using a dual-core CPU. The GPU doesn't matter; AE doesn't actually use it in most cases and was never originally designed to do so. Most of the effects aren't multi-threaded or don't use the GPU. If you had two CPUs of the same size and architecture, theoretically the one with closer to two cores would waste the least of its processing power.

 

That said, the program predates discrete GPUs, so it also uses the CPU both for updating the UI and for calculating effects. Theoretically, lots of cores mean lots of simultaneous frames, but all of it is useless when you need to wait 5 seconds for 128 consecutive frames, and due to scheduling, you need to stop at the 12th, 27th, and 29th frames until another is done, because it won't render until the playhead is waiting on it. For reasons.

 

As well, the caching is about the least efficient setup possible for modern hardware. The footage is loaded uncompressed into RAM, and after the CPU has processed frames, they're dumped into RAM too. Note that putting something in RAM is yet another thing that taxes the CPU.
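For scale, a rough estimate of how much footage an uncompressed RAM cache can actually hold (assuming RGBA buffers, a UHD frame size, and ignoring cache bookkeeping overhead):

```python
# How many uncompressed UHD frames fit in a RAM cache of a given size?
# Rough estimate assuming RGBA buffers; the usable-cache figure is an
# assumption, since the OS and the app itself need RAM too.

def frames_in_cache(cache_gib, width, height, bits_per_channel, channels=4):
    """Whole frames that fit in a cache of cache_gib GiB."""
    frame = width * height * channels * (bits_per_channel // 8)
    return int(cache_gib * 2**30 // frame)

for bpc in (8, 16, 32):
    n = frames_in_cache(100, 3840, 2160, bpc)   # ~100 GiB usable of 128 GB
    print(f"{bpc:2d} bpc: ~{n} frames (~{n / 25:.0f} s at 25 fps)")
```

At 32 bpc, even a 128 GB machine caches only on the order of half a minute of uncompressed UHD footage, which is why bit depth changes how much of a comp stays cached.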

 

It's not you, it's After Effects, and if you really want to take advantage of all that beef, look at Nuke for comp, DaVinci Resolve for color/edits, and Blender for 3D. In my experience, those three scale with hardware. You can look up Puget Systems - even they note AE does NOT scale with better or newer hardware.

Contributor, Feb 29, 2024


Are you serious?

Nothing you said relates in any way to what I asked.

Almost none of the sentences you wrote are true. The only one I can't argue with is "you're not using a dual-core CPU" - everything after that is either partly or completely false.

I work in both AE and Nuke, and if you think Nuke is faster than AE, then you clearly don't work professionally in at least one of these programs.

There is no way to play anything in Nuke without stutters. 8K? 4K in ACES? Forget about it - you'll get 10 fps and be thankful.

Resolve, Blender - why bring them up at all?

You forgot to mention your favorite web browser as well.

Caching data in RAM is the least efficient? Care to elaborate on what other methods exist? Caching to NVMe, or maybe realtime rendering?

 

This is a simple inquiry about a problem that is bugging me.

It's a first step toward reporting a bug, because it seems there is one, but I'm not going to report it as a bug until I get confirmation from other users that it's also happening on their machines.

 

Seriously, why are you even here, why respond?

Oh, by the way, you forgot to mention that Windows sucks and that it would work perfectly on Linux if Adobe ever ported AE to Linux.

Explorer, Mar 01, 2024


We have a near-identical setup (7950X, 96 GB RAM, and a 4080), and I use ACES color management, but I don't seem to have that issue in 32 bpc. I get realtime playback with heavy effects using ProRes 4444 and ProRes 4444 XQ clips.
