Inspiring
November 22, 2023
Question

Creating Layers/ Altering pixels from raw data in an AEGP.


Having a hard time coming up with some solutions for this one...

I'd like to be able to pass raw image data into an AEGP, and use that to either alter existing pixels in items, or create new footage/layers with the image/video/whatever data. 

 

I have an ImageData struct like so:

```c++
struct ImageData {
    std::shared_ptr<std::vector<uint8_t>> data;
    int width;
    int height;
    int channels;

    // Default constructor (initializer list matches declaration order)
    ImageData() : data(std::make_shared<std::vector<uint8_t>>()), width(0), height(0), channels(0) {}

    // Constructor with shared_ptr and dimensions
    ImageData(std::shared_ptr<std::vector<uint8_t>> d, int w, int h, int c)
        : data(std::move(d)), width(w), height(h), channels(c) {}

    // Constructor that allocates a zeroed buffer
    ImageData(int w, int h, int c) : width(w), height(h), channels(c) {
        // cast to size_t so w * h * c can't overflow int for large frames
        data = std::make_shared<std::vector<uint8_t>>(static_cast<size_t>(w) * h * c);
    }
};
```
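For the Python side, a minimal sketch of the same layout might look like this, assuming tightly packed, interleaved 8-bit channels (the class and method names here are illustrative, not the actual API):

```python
# Hypothetical Python analogue of the ImageData struct above, assuming a
# row-major, interleaved buffer where channels vary fastest.
class ImageData:
    def __init__(self, width, height, channels):
        self.width = width
        self.height = height
        self.channels = channels
        self.data = bytearray(width * height * channels)

    def index(self, x, y, c=0):
        # Interleaved layout: offset of channel c of pixel (x, y).
        return (y * self.width + x) * self.channels + c

    def set_pixel(self, x, y, values):
        i = self.index(x, y)
        self.data[i:i + self.channels] = bytes(values)

    def get_pixel(self, x, y):
        i = self.index(x, y)
        return tuple(self.data[i:i + self.channels])

img = ImageData(4, 4, 4)
img.set_pixel(1, 2, (255, 0, 0, 255))
```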

 

From what I'm seeing, I need to use a file path to actually add new layers and items.

 

I've considered using a null or solid layer and then replacing it, or even embedding an effect plugin that gets added to layers and called by the AEGP when necessary.

From some initial tests, getting AEGP_WorldH across all frames in a comp is very slow, but it is definitely doable. The same goes for altering the AEGP_WorldH with data passed in.

Perhaps AEGP_IterateGeneric could help?

 

At this point I'm leaning towards the "embedded plugin" idea, since that could take advantage of multithreading, and the only communication between it and the AEGP would happen through a single-threaded call, eliminating potential race conditions and issues with multiple AEGP instances.

 

TL;DR: I'm exposing the SDK to Python and I want to offer options for adding and manipulating custom image/video/audio data (both gathered and passed in), so that people can use the massive ecosystem of Python libraries to create some really cool stuff. I need help figuring out alternative approaches to doing so.


Any suggestions would be welcome.

 

Thank you! 🙂


Community Expert
November 24, 2023

well... all that very much depends on what user experience or workflow you're aiming for.

is it an "overall" python script that governs the entire project, a one-off script that does a task when asked to, or a filter that changes the layer image like an effect?

 

let's talk technical concepts.

an AEGP can request an item's buffer and read it. but can the AEGP *change* that buffer? the buffer passed to the AEGP may be cached by AE for reuse, it may be a COPY of the cached buffer, or it may be an entirely uncached temp buffer. so changing the passed buffer is highly likely to not show up anywhere, and even then the changed buffer may be discarded and regenerated without any detectable indication.

an AEGP can create a new image file on disk. that image file can be imported back into AE and placed in a comp, but as i said, that's a different user experience and workflow.

 

effects are passed the buffer of either the layer or the preceding masks/effects, and are also handed an output buffer to write the output to. that buffer can be passed to the AEGP for it to write into. however, that has to be synchronous: once the render call has exited there's no telling what AE will do with that buffer, so it must be filled before returning.

also, fetching the input buffer for effects is much more optimized than requesting said inputs directly from an AEGP. it's not that AE renders faster for effects; it's that AE can predict what effects will need and therefore cache the required inputs, while it can't predict what an AEGP will need and therefore usually renders inputs on request.

 

so, perhaps if you described the workflow you'd like from the user's point of view, then maybe i could help you strategize technically.

Inspiring
November 26, 2023
Project Overview and Workflow:

My project aims to let users write Python scripts to create and manipulate custom image, video, and audio data within After Effects. The current functionality allows users to select and run scripts through a menu command, with plans to evolve this into a more interactive, GUI-based approach. It could be a global script, or a simple task-based script. The best way I can put it: it works nearly the same as running ExtendScript via File -> Scripts -> Run, and the end goal is exposing that functionality to "CEP" ("PYE") extensions as well, effectively offering an alternative to ExtendScript.

Technical Approach and Error Handling:

I'm working with pybind11 for Python integration and wrapping SDK functions in a Result<T, A_Err> struct for error handling. If an SDK function fails, the error is propagated up to Python for appropriate handling. Result structs are thread-safe, and the exposed Python classes use std::shared_ptr for memory management. I'm wrapping the entire SDK in single-concern functions like this:

```c++
Result<AEGP_ItemH> getActiveItem()
{
    AEGP_SuiteHandler& suites = SuiteManager::GetInstance().GetSuiteHandler();
    A_Err err = A_Err_NONE;
    AEGP_ItemH itemH = nullptr; // Initialize to nullptr in case AEGP_GetActiveItem fails
    ERR(suites.ItemSuite9()->AEGP_GetActiveItem(&itemH));

    if (err != A_Err_NONE) {
        // Note: throwing here means a Result carrying the error is never
        // returned; pick one mechanism (throw OR return the error in Result).
        throw std::runtime_error("Error getting active item. Error code: " + std::to_string(err));
    }

    Result<AEGP_ItemH> result;
    result.value = itemH;
    result.error = err;
    return result;
}
```
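From the Python side, a pattern like this usually surfaces as an exception rather than an error value. Here is a hypothetical sketch of that behavior; the names (`get_active_item`, `A_Err_NONE`) are illustrative stand-ins, not the real bindings:

```python
# Hypothetical sketch of how a Result<T, A_Err> wrapper could surface in
# Python: the binding raises RuntimeError (carrying the error code) instead
# of handing Python a raw error value.
A_Err_NONE = 0
A_Err_GENERIC = 1

def get_active_item(simulated_err=A_Err_NONE):
    # Stand-in for the bound C++ function; a real binding would call the SDK
    # and translate a non-zero A_Err into a Python exception.
    if simulated_err != A_Err_NONE:
        raise RuntimeError(f"Error getting active item. Error code: {simulated_err}")
    return "itemH"  # placeholder for a real AEGP_ItemH handle

try:
    item = get_active_item()
except RuntimeError:
    item = None
```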

Regarding the wrappers-- anything that stands out that I could be doing a bit better here for error handling?

Performance Concerns:

A primary focus is on optimizing performance, particularly regarding UI blocking, processing speed, and memory management.
I've solved the UI-blocking issue by putting Python on a separate thread and calling into the idle hook with 1:1 command calls. (Each exposed Python method sends one call into the idle-hook queue, which, even with multiple items queued, returns after each call to prevent blocking.)
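The 1:1 idle-hook pattern described above can be sketched roughly like this; the function names are hypothetical and the AE idle hook is simulated by plain calls:

```python
import queue

# Minimal sketch of the described pattern: the Python thread enqueues
# commands, and each idle-hook invocation pops at most ONE command and
# returns, so the UI thread is never blocked by a long queue.
command_queue = queue.Queue()  # thread-safe by construction

def enqueue_command(fn, *args):
    command_queue.put((fn, args))

def on_idle_hook(results):
    # Called by AE on the main thread (simulated here); handles one command.
    try:
        fn, args = command_queue.get_nowait()
    except queue.Empty:
        return
    results.append(fn(*args))

results = []
enqueue_command(lambda x: x * 2, 21)
enqueue_command(lambda s: s.upper(), "ok")
on_idle_hook(results)   # first idle pass handles one command
on_idle_hook(results)   # second pass handles the next
```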

Python-AE Communication:

The plugin uses a thread-safe queue for communication between Python and AE, ensuring a non-blocking UI.

Frame Handling in Effect Plugins:
Here's how I envision the frame handling working: in the effect plugin, frames are gathered and placed in a struct that includes the frame number, for accurate handling in Python. They are then processed by Python and sent back. Another approach is for users to write a custom callback function, used by the effect to implement the desired effect. Here's a Python example demonstrating this concept:

```python
from PyShiftCore import *

def some_callback_function(**args): # create a callback function
    # processing pixels here
    return ImageData

comp = app.project.activeItem  # check for the activeItem

if isinstance(comp, CompItem):  # if comp is actually a composition

    layer1 = comp.layer[0] # get the first layer

    layer1.customEffect(some_callback_function) # pass the callback itself; the effect will call into python to perform processing.

    # envisioned API (signatures illustrative):
    layer1.requestFrames(frameNums)      # request frames at the given frame numbers (list[int])
    layer1.requestAllFrames()            # request all frames from the layer
    layer1.replaceFrame(frameNum, frame) # replace the frame at a given frame number (int, ImageData)
    layer1.replaceAllFrames(frames)      # replace all frames in the layer

else:
    app.reportInfo("Select a Composition first!") # if comp is not a CompItem, the user needs to select a comp instead.

```
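To make the callback contract above concrete, here is a small runnable mock. Everything in it is a stand-in (the real PyShiftCore API may differ); it only demonstrates the idea of calling the user's function per frame while keeping frame numbers paired with the data:

```python
# Mock of the envisioned callback contract: a stand-in "effect" calls the
# user's function once per frame. Frame numbers travel with the pixel data
# so results map back correctly even if processing finishes out of order.
class Frame:
    def __init__(self, number, pixels):
        self.number = number
        self.pixels = pixels

def apply_custom_effect(frames, callback):
    # Returns {frame_number: processed_pixels}.
    return {f.number: callback(f.pixels) for f in frames}

def invert(pixels):
    # Example user callback: invert 8-bit channel values.
    return [255 - p for p in pixels]

frames = [Frame(0, [0, 128, 255]), Frame(1, [10, 20, 30])]
out = apply_custom_effect(frames, invert)
```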

Regarding your mention of AEGPs writing into the output buffer, this synchronous approach seems promising, especially considering the challenges with the Python GIL. My main objective is to streamline the transfer of pixel data into numpy arrays for extensive processing capabilities.

Future Plans and Challenges:
The long-term goal is to enable users to create custom effects, transitions, and standard manipulation through Python, eliminating the need for extensive C++ plugin development. I have established a solid foundation with numerous working classes, attributes, and methods. In the near term, I aim to finish wrapping the SDK functions, ensuring proper error and memory handling, and then flesh out PyShiftCore to match most attributes of extendscript.

Hopefully this paints a clearer picture. If there's anything that needs further clarification, or if you have any specific suggestions or feedback, especially regarding workflows for the custom processing and object lifetime, please let me know!

Thank you for your time and consideration!
Inspiring
December 6, 2023

Okay! So, I think I came up with a solution, and I'd be keen to hear your thoughts.

End Goal:
From my AEGP plugin, allow the user to script the creation of a "custom" effect. 


How I plan to do it:

On the AEGP & Effect Plugin Sides:

Embed a Python interpreter.

Expose classes like so:

```c++
// Slider class equivalent
class Slider {
public:
    Slider(const std::string& name, int minS, int maxS, int increment, int default_val)
        : name(name), minimum(minS), maximum(maxS), increment(increment), default_val(default_val) {}

    std::string name;
    int minimum, maximum, increment, default_val;
};

// Params class equivalent
class Params {
public:
    std::vector<Slider> sliders;
};

// CustomEffect class equivalent
class CustomEffect {
public:
    CustomEffect() : name(""), elements(), callback_function() {}

    std::string name;
    Params elements;
    py::function callback_function;
};
```

This way, I can provide a simple interface to dynamically change the UI elements, and what "effect" is being applied. 
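Seen from the Python side, the bound classes above might behave roughly like these dataclasses; this is only a sketch of the intended shape, not the actual pybind11 bindings:

```python
from dataclasses import dataclass, field
from typing import Callable, List, Optional

# Hypothetical Python-side view of the C++ classes above. The real
# pybind11 bindings would expose equivalent attributes on wrapped objects.
@dataclass
class Slider:
    name: str
    minimum: int
    maximum: int
    increment: int
    default_val: int

@dataclass
class Params:
    sliders: List[Slider] = field(default_factory=list)

@dataclass
class CustomEffect:
    name: str = ""
    elements: Params = field(default_factory=Params)
    callback_function: Optional[Callable] = None

fx = CustomEffect(name="Invert")
fx.elements.sliders.append(Slider("Amount", 0, 100, 1, 50))
```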

 

From the Effect Plugin Side:

Create a "Pool" of UI elements, hidden when first applied. AFAIK I can't dynamically add/remove elements, but I can hide/show existing ones. 

Initialize a 'CustomEffect' instance within inData (this can be done through globalData, correct?)
Embed the Python interpreter. To test things out I'm initially working single-threaded, but my idea for multithreading is as follows:

 

Determine the # of threads for the effect, and initialize that many python processes in a pool. 

When the render command is called, it will use the pool, adding and removing things from the queue as it finishes them. 
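The pooled-render idea can be sketched with a standard thread pool. In the real plugin each worker would own its own embedded interpreter (or process) to sidestep the GIL; plain Python threads and a placeholder per-frame function stand in for that here:

```python
from concurrent.futures import ThreadPoolExecutor

# Sketch of the pool: frames are farmed out to a fixed set of workers and
# results are collected as they finish, keyed by frame number.
def render_frame(frame_number, pixels):
    # Placeholder per-frame work; the real version would run user code.
    return frame_number, [p * 2 for p in pixels]

def render_all(frames, num_workers=4):
    with ThreadPoolExecutor(max_workers=num_workers) as pool:
        results = dict(pool.map(lambda f: render_frame(*f), frames))
    return results

frames = [(0, [1, 2]), (1, [3, 4]), (2, [5, 6])]
rendered = render_all(frames)
```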

 

The Effect will receive PF_Cmd_COMPLETELY_GENERAL, which will contain the CustomEffect data sent from the AEGP.

It will then take this and modify the inData->globalData refcon we stored earlier.

The RespondtoAEGP function will update the UI with the required elements and arguments (keeping all others hidden). It will also cast the callback function to the refcon.

 

In the render function, it will conditionally check for a callback. If one exists, it will use it. (The assumption is that the order of params in the function matches the order and number of params in the UI, which will be documented and required by the API.) If it doesn't exist, it'll just push input to output.
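That conditional render path reduces to something like this sketch (names hypothetical):

```python
# Sketch of the render path described above: if a user callback has been
# registered, run it; otherwise copy input straight to output unchanged.
def render(input_pixels, callback=None):
    if callable(callback):
        return callback(input_pixels)
    return list(input_pixels)  # passthrough: push input to output

no_fx = render([1, 2, 3])
with_fx = render([1, 2, 3], callback=lambda px: [p + 1 for p in px])
```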

From the AEGP Side:
After passing the information over, the AEGP simply receives a response indicating success or failure, so that it may continue executing if it needs to. From this point, we have an entirely custom effect applied! (Though I'd like to figure out how to maintain state for the project somehow.)


Community Expert

i think there's some confusion here between inData and sequence_data.

in_data is the structure passed by AE to an effect plug-in on every call. it contains information relevant to that call, and is valid for the duration of that call.

sequence data is the "custom storage" handle for each effect instance. that handle is passed to the effect via the in_data struct on every call. that data can be "flat" (serialized) or "unflat" (deserialized).
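The flat/unflat distinction amounts to serializing live per-instance data to a byte blob and rebuilding it later. A minimal sketch, with JSON as an arbitrary stand-in for whatever serialization the plugin actually uses:

```python
import json

# Sketch of flat vs unflat sequence data: AE may ask an effect to "flatten"
# its per-instance data (e.g. for saving the project) and to rebuild the
# live, in-memory form afterwards.
def flatten(sequence_data):
    return json.dumps(sequence_data, sort_keys=True).encode("utf-8")

def unflatten(blob):
    return json.loads(blob.decode("utf-8"))

live = {"effect_name": "Invert", "slider_values": [50]}
flat = flatten(live)          # the serialized blob AE can store
restored = unflatten(flat)    # rebuilt live form on reload
```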

 

i also didn't quite understand the use of global data here. yes, global data is a good place to store some "brains" for all instances together, but we're also talking about an AEGP, which is (usually) a separate plugin. so i think there might be some confusion between the two here as well... or i just didn't understand the workflow you described.