
GPU API

Participant, Dec 14, 2025

Quick question about the GPU API: does AE force CPU rendering when buffer expansion happens? I've done some testing in the debugger, and it seems SmartRenderGPU never gets called when expansion occurs. Do I have to do something specific, or is it simply not possible?

TOPICS: SDK
Community guidelines: Be kind and respectful, give credit to the original source of content, and search for duplicates before posting.

1 Correct answer

Participant, Jan 05, 2026

https://drive.google.com/file/d/1zIFDmQlOC2zQuwCkYI6ZTnAbIsDDR49N/view?usp=sharing

I finally fixed the issue, so I decided to share one of my old plugins, which I modified to handle buffer expansion on both the CPU and the GPU. Everything you need to see is in that project. Just make sure to set PF_RenderOutputFlag_RETURNS_EXTRA_PIXELS with "|=", NOT "=". If you set it with plain "=", the plugin will just automatically fall back to CPU rendering when expansion happens.

Participant, Dec 14, 2025

Also a quick note: it seems to have something to do with the PF_RenderOutputFlag_RETURNS_EXTRA_PIXELS flag.

If you set it with "|=", expansion happens on the GPU as well, but it completely corrupts the image for non-transform effects, like blurs.

If you set it with "=", expansion forces CPU rendering.

Engaged, Dec 18, 2025

Sorry, I don't have an answer to your question, but regarding this:

If you set it like this "|=" expansion happens on GPU as well, but it completely corrupts the image for non-transform effects, like blurs.

Is it corrupting the input buffer, or is it junk memory in the output buffer?

Participant, Dec 19, 2025

I'm not sure what it corrupts, but it looks like this:

[Attached screenshot: Screenshot 2025-07-10 174929.png]

But I'm clearly doing something wrong, because there are built-in AE effects that are GPU-accelerated and also expand the buffer on the GPU, not on the CPU: effects like Directional Blur, Gaussian Blur, and Transform.

Some extra info: the corruption you see in the image happens when I apply my DBlur effect above my Exposure effect; if I apply DBlur alone or with other effects, it does not happen. That pretty much proves I'm doing something wrong, not that it's impossible. It's also a shame that hardly anyone has experience with the GPU API.

Engaged, Dec 19, 2025

You have probably already tried this, but just to be safe: are you explicitly clearing all buffers before use?

Could your upstream Exposure effect be causing it, given that first-party effects don't show the issue?

Yes, not many people have experience with the newer GPU API. I've played around with it but never shipped any plugins on it.

Participant, Dec 20, 2025

Yeah, I've tried that. I've also just tested with my other effects, and it happens with every single one: if I apply my DBlur above any of my other effects, it corrupts the image. If I apply DBlur alone or with AE's built-in effects, it does not happen. I'll try some debugging to see what values I get on the CPU path versus the GPU path.

Participant, Dec 20, 2025

Just found these errors in AE's logging:

Ticks = 42641 <24380> <GF.CUDAError> <5> CUDA Error: result: CUDA_ERROR_ILLEGAL_ADDRESS, detail: {"first": false, "error": "CUDA_ERROR_ILLEGAL_ADDRESS", "detail" : "cuMemAlloc", "deviceMemory": {"outstandingSize": 0, "allocatedSize": 0, "pooledCount": 0, "allocatedCount": 0, "softPoolLimit": 3221061632, "hardPoolLimit": 4831592448}, "hostMemory": {"outstandingSize": 2334208, "allocatedSize": 2334208, "pooledCount": 0, "allocatedCount": 1, "softPoolLimit": 854173696, "hardPoolLimit": 1708347392}, "videoMemory": {"budget": 5450498048, "usage": 780791808}}
Ticks = 42641 <24380> <GPUFoundation> <0> Unable to allocate device memory of size: 2mb, current size: 0mb

These happen when I apply the effect with AE's built-in effects too. It just shows a black image and falls back to CPU rendering.

Participant, Dec 20, 2025

=== EXPOSUREGAMMA PRERENDER ===
Input Rect: L=-108 T=-135 R=1188 B=1485
Input Dimensions: 1296x1620
================================

=== EXPOSUREGAMMA GPU RENDER ===
Input World: width=2538, height=3078
Output World: width=1296, height=1620
================================

Yeah, I think this is really, really wrong.

Participant, Dec 20, 2025

Yep, I fixed the issue. In short:

SmartRenderGPU needs to use output_world->width/height, not input_world (this fixes the corruption from buffer-size mismatches):

// Use output dimensions, not input dimensions.
// Input may be larger when upstream effects expand the buffer
eg_params.mWidth = output_worldP->width;
eg_params.mHeight = output_worldP->height;

The kernel needs to account for the buffer offset (this fixes the misalignment):

// Calculate offset to center the output in the input buffer
int offsetX = (inInputWidth - inWidth) / 2;
int offsetY = (inInputHeight - inHeight) / 2;

// Read from input at the offset position
int srcX = inXY.x + offsetX;
int srcY = inXY.y + offsetY;

(This applies even if the effect itself does not expand the buffer, as with my Exposure/Gamma effect in this case.)

It's really weird that AE's built-in effects that don't expand the buffer (like Exposure) don't do this, which should make them fail with effects that expand the buffer on the GPU. And yet, from what I can see, the built-in Directional Blur works with the built-in Exposure effect; I don't know how, but it does.

Another interesting thing I found is that the Directional Blur effect from AE gives my Exposure/Gamma effect matching input/output buffer sizes:

=== EXPOSUREGAMMA GPU RENDER ===
Input World: width=1296, height=1360
Output World: width=1296, height=1360
================================

While my Directional Blur effect gives it a 2x larger input buffer, so it clearly does some extra work before passing the buffer to the next effect. This is an issue I have no idea how to solve, though: why does my Directional Blur give Exposure a 2x larger input world when the built-in Directional Blur doesn't?

Participant, Dec 28, 2025

So, any ideas as to why the input/output mismatch happens in the first place? If I can't fix that, my effects won't work with first-party effects.

Participant, Jan 05, 2026

Hey @dkt0, I don't have any experience developing plug-ins, but it's something I've been keen to learn about. I found this thread quite interesting from a learning standpoint, so thank you for the updates you provided throughout your troubleshooting process. Would you be open to explaining how you were ultimately able to solve the input/output mismatch? Thanks.

Participant, Jan 06, 2026

Yo, glad you found it interesting. Ever since I started making plugins, buffer expansion has been the hardest thing I've encountered, mainly because there isn't any sample plugin that handles it via the SmartFX API. It might be a lot to read, but here is a text file that explains everything. If you need further help, you can reply here or DM me (here or on Discord: dkt0).

Quick note: where I said that part can be skipped during PreRender, after more testing, you should not skip it 🙂
