Multi-Frame Rendering Effect C++ SDK and OpenGL
Hi all,
I've got a third-party effect for After Effects that relies extensively on OpenGL for rendering.
Even though the effect renders quickly, I'd still like to get rid of the warning icon that the latest After Effects version shows for effects that don't support the "Multi-Frame Rendering" feature.
Although enabling that feature in an effect is quite straightforward, it doesn't seem to play well when the effect relies on OpenGL. I do know that using OpenGL inside a multithreaded CPU host requires some setup, but unfortunately I can't find any guidelines in the After Effects C++ documentation, nor in this forum.
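For reference, turning the feature on itself is just the flag in GlobalSetup (PF_OutFlag2_SUPPORTS_THREADED_RENDERING is the SDK flag; the surrounding function is the usual entry-point boilerplate, and as far as I understand the matching bit also has to be set in the PiPL):
#include "AE_Effect.h"
// Advertise Multi-Frame Rendering support; the PiPL's out_flags2 must match this.
static PF_Err GlobalSetup(
    PF_InData   *in_data,
    PF_OutData  *out_data,
    PF_ParamDef *params[],
    PF_LayerDef *output)
{
    out_data->out_flags2 |= PF_OutFlag2_SUPPORTS_THREADED_RENDERING;
    return PF_Err_NONE;
}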
One failed attempt was "hacking" the AESDK_OpenGL_EffectCommonData class inside GL_base.cpp and GL_base.h by maintaining a cache of OpenGL contexts keyed by thread id. The idea is that an OpenGL context shouldn't be shared across threads; each thread should have its own. On each Render() call I call AESDK_OpenGL_EffectCommonData::SetPluginContext(), which originally reused the same OpenGL context, and now I create a new context for each thread (if one hasn't been created already).
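Roughly, the caching idea looks like this (heavily simplified sketch, not my actual code; CreateNewGLContext() and MakeContextCurrent() are placeholders for the platform-specific calls that GL_base.cpp already wraps):
#include <map>
#include <mutex>
#include <thread>

struct GLContextHandle;                        // opaque wrapper around the platform GL context
GLContextHandle* CreateNewGLContext();         // placeholder for the context creation in GL_base.cpp
void MakeContextCurrent(GLContextHandle* ctx); // placeholder for wglMakeCurrent / CGLSetCurrentContext

static std::map<std::thread::id, GLContextHandle*> S_contexts;
static std::mutex S_contexts_lock;

GLContextHandle* GetContextForCurrentThread()
{
    const std::thread::id tid = std::this_thread::get_id();
    std::lock_guard<std::mutex> guard(S_contexts_lock);
    auto it = S_contexts.find(tid);
    if (it == S_contexts.end()) {
        // first Render() call on this thread: give it its own context
        it = S_contexts.emplace(tid, CreateNewGLContext()).first;
    }
    MakeContextCurrent(it->second);
    return it->second;
}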
I also tried the GLator SDK example provided for macOS; it compiles successfully but crashes immediately when launching After Effects.
I'd appreciate any guidelines on how to enable "Multi-Frame Rendering" for a custom third-party effect that works extensively with OpenGL.
Most third-party effects I know of simply implement their own GL clients and don't bother with this stuff in the native AE API. Depending on what you do and how many threads there are, I also don't think it's particularly viable to have a GL instance for each of them. You'll probably have to redesign the entire internal functionality instead of just flipping the switch. Just a few random thoughts, though...
Mylenium
I'm really sorry, but that's not an answer.
How do you explain the GLator SDK example, which the SDK describes as "a demonstration of proper OpenGL context management in an effect plugin", crashing immediately on launch?
Hi Avi
I was using the GLator sample extensively. I don't think there was any consideration for MFR in that sample, as it was deprecated before MFR was launched. One context per thread is correct, and there may be ~10 threads at any time. I had a lot of trouble with this, so I found a more knowledgeable dev to implement it for me.
GLator crashes ("sometimes!") because it has a flagrant buffer overwrite in GL_base.cpp when it reads in the shader files: it allocates a buffer of the file length, then writes a null terminator one byte past the end of that buffer. It's pretty sloppy. 😞 The code below is GL_base.cpp, lines 794-796.
bufferP = new unsigned char[fileLength];
int32_t bytes = static_cast<int32_t>(fread( bufferP, 1, fileLength, fileP ));
bufferP[bytes] = 0;
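The fix is simply to allocate room for the terminator, something along these lines:
bufferP = new unsigned char[fileLength + 1];   // +1 leaves room for the null terminator
int32_t bytes = static_cast<int32_t>(fread(bufferP, 1, fileLength, fileP));
bufferP[bytes] = 0;                            // now always inside the allocation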
(Also, GLator's use of the thread id as a map key for the GL context isn't how AE allocates threads. AE has a "Worker Pool", and the same effect instance can be assigned to any of those threads as needed... and so can other instances of the same effect. S_render_contexts[t_thread] isn't what they really want, so that also causes crashes.)
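If I were doing it today I'd try a small pool of contexts checked out per Render() call instead of keying on the thread id. Rough sketch only, my own naming, not SDK API; the Create/MakeCurrent/Release functions stand in for the platform calls in GL_base.cpp:
#include <mutex>
#include <vector>

struct GLContextHandle;                        // opaque wrapper around the platform GL context
GLContextHandle* CreateNewGLContext();         // placeholders for the platform-specific calls
void MakeContextCurrent(GLContextHandle* ctx);
void ReleaseCurrentContext();

class GLContextPool {
public:
    GLContextHandle* Acquire()                 // call at the top of Render()
    {
        GLContextHandle* ctx = nullptr;
        {
            std::lock_guard<std::mutex> guard(lock_);
            if (free_.empty()) {
                free_.push_back(CreateNewGLContext());   // grow lazily, one per concurrent render
            }
            ctx = free_.back();
            free_.pop_back();
        }
        MakeContextCurrent(ctx);               // bind on whichever worker thread is rendering
        return ctx;
    }

    void Release(GLContextHandle* ctx)         // call before Render() returns
    {
        ReleaseCurrentContext();
        std::lock_guard<std::mutex> guard(lock_);
        free_.push_back(ctx);
    }

private:
    std::mutex lock_;
    std::vector<GLContextHandle*> free_;
};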

