How to get a 'UTexture2D*' from the output of a 'USubstanceGraphInstance*'?

New Here, Feb 28, 2024

Hello,

 

I am using the Substance Plugin with Unreal Engine 5.3.

 

I have a USubstanceGraphInstance* with some input parameters that I customize at runtime before rendering the graph instance. Since I only want to read the pixel data from the generated output textures, I don't need to save the textures to disk.

 

I have two questions:

  • How do I register a callback function, or is there another way to get notified when the async rendering of a graph instance has finished?

  • Once the rendering has finished, how can I access the generated output textures as a UTexture2D*?

 

Thanks!

TOPICS: UE5

Adobe Employee, Mar 01, 2024

Hi Kontur,

 

You can access the textures through the USubstanceGraphInstance pointer with the code below:

 

for (const auto& output : USubstanceGraphInstancePointer->Instance->getOutputs())
{
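	// mUserData, when set, points to a USubstanceOutputData whose GetData() returns the generated texture object.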
	if (output->mUserData != 0)
	{
		UTexture2D* texture = Cast<UTexture2D>(reinterpret_cast<USubstanceOutputData*>(output->mUserData)->GetData());
		// Do something with the texture
	}
}

 

Also, when the asynchronous rendering finishes for each texture, texture->PostEditChange() is called. That is the only event fired from our internal post-render callback once rendering is done, so listen for that.
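
For reference, once you have a texture from the loop above, reading the pixel data back could look roughly like the sketch below. This is not plugin API: it assumes the output uses an uncompressed pixel format such as PF_B8G8R8A8 and that the first mip's bulk data is still CPU-accessible, and ReadCenterPixel is just an example name.

#include "Engine/Texture2D.h"

// Minimal sketch: read the first mip of a generated output on the game thread.
// Assumes an uncompressed format (e.g. PF_B8G8R8A8); compressed BC outputs
// would need decompression or a render-target readback instead.
void ReadCenterPixel(UTexture2D* Texture)
{
	FTexture2DMipMap& Mip = Texture->GetPlatformData()->Mips[0];
	const int32 Width = Mip.SizeX;
	const int32 Height = Mip.SizeY;

	if (const FColor* Pixels = static_cast<const FColor*>(Mip.BulkData.LockReadOnly()))
	{
		// Example: sample the center pixel of the generated output.
		const FColor Center = Pixels[(Height / 2) * Width + (Width / 2)];
		UE_LOG(LogTemp, Log, TEXT("Center pixel: R=%d G=%d B=%d A=%d"),
			Center.R, Center.G, Center.B, Center.A);
	}
	Mip.BulkData.Unlock();
}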

New Here, Apr 27, 2024

Thank you for your swift reply. I am just now getting around to following up.

 

Unfortunately, PostEditChange() is an editor-only function, and we need this to work in a shipping build.

 

To provide more context, the output textures are used as heightmaps for procedural terrain generation in a seed-based world. Currently, we extend the Substance plugin with a query manager that handles asynchronous render requests and calls back the game code when UpdateTexture() is eventually called in SubstanceCoreHelpers.cpp (which seems to occur in shipping builds too). However, this approach feels very hacky and fragile, so I wonder if there is an "official" or at least a better way of doing it.
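
In sketch form, that kind of hook looks something like this (FSubstanceRenderNotifier and FOnSubstanceOutputUpdated are placeholder names, not plugin API; the only plugin change is a Broadcast() call added inside UpdateTexture()):

#include "CoreMinimal.h"

class UTexture2D;

// Fired once per output texture when the patched
// SubstanceCoreHelpers::UpdateTexture() runs.
DECLARE_MULTICAST_DELEGATE_OneParam(FOnSubstanceOutputUpdated, UTexture2D* /*UpdatedTexture*/);

class FSubstanceRenderNotifier
{
public:
	static FSubstanceRenderNotifier& Get()
	{
		static FSubstanceRenderNotifier Instance;
		return Instance;
	}

	// Game code binds here to be called back when an output has been updated;
	// the patched UpdateTexture() calls OnOutputUpdated.Broadcast(Texture).
	FOnSubstanceOutputUpdated OnOutputUpdated;
};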

 

Ideally, when a new level is generated in the game, we want to instantiate the Substance graph that represents the terrain type of the level (each terrain type is represented by a different Substance graph) and we want to asynchronously and simultaneously render the level's heightmap in multiple chunks. To do this, each Substance graph has X and Y input parameters, which are used in the graph to render the appropriate chunk of the global level heightmap. We would read the pixel data from the output textures and then discard them along with the graph instances. When a new level is needed, the cycle starts again.
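
In code, that flow would look roughly like the sketch below. The calls to USubstanceUtility::CreateGraphInstance, SetInputInt, and USubstanceUtility::AsyncRendering are written from our reading of the plugin headers, so the exact signatures (and the input identifiers "Seed", "X", "Y", and the chunk grid size) should be checked against SubstanceUtility.h and SubstanceGraphInstance.h.

#include "SubstanceGraphInstance.h"  // header names per the plugin source; adjust if they differ
#include "SubstanceUtility.h"

// Rough per-level flow; signatures and input identifiers are assumptions (see note above).
void RenderHeightmapChunks(UObject* WorldContext, USubstanceInstanceFactory* TerrainFactory, int32 Seed)
{
	TArray<USubstanceGraphInstance*> ChunkInstances;

	for (int32 ChunkY = 0; ChunkY < 2; ++ChunkY)
	{
		for (int32 ChunkX = 0; ChunkX < 2; ++ChunkX)
		{
			// One instance per chunk; whether a single instance could render
			// several parameter sets at once is exactly the open question below.
			USubstanceGraphInstance* Instance =
				USubstanceUtility::CreateGraphInstance(WorldContext, TerrainFactory, 0);

			Instance->SetInputInt(TEXT("Seed"), { Seed });
			Instance->SetInputInt(TEXT("X"), { ChunkX });
			Instance->SetInputInt(TEXT("Y"), { ChunkY });

			ChunkInstances.Add(Instance);
		}
	}

	// Kick off the asynchronous renders; completion is observed via the
	// UpdateTexture() hook described above, after which we read the pixels
	// and discard the instances.
	USubstanceUtility::AsyncRendering(ChunkInstances);
}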

 

However, we are generally unsure about the proper workflow for runtime generation, so any insights you could provide would be immensely helpful!

  • Instantiating graphs at runtime seems to work, but there is a noticeable hitch when USubstanceUtility::CreateGraphInstance() is called. Is this expected, or are there ways to mitigate this?
  • Can a single graph instance simultaneously render multiple sets of graph outputs all using different input parameters for the graph? It would be really convenient to use a single graph instance to asynchronously render all heightmap chunks at the same time.
  • If runtime graph instantiation is a bad idea, and/or a graph instance can only render one set of inputs/outputs at a time, what are our options? Creating a graph instance in the editor for each chunk of each terrain graph type (there are hundreds) would be suboptimal and wasteful: we would have to ship the game with thousands of extra offline textures (graphs instantiated in the editor create a default output texture that they update on each render), when in practice only one graph type and its seed-based outputs are relevant at any given time, and only until we read the pixel data.

 

Thank you!

Adobe Employee, Apr 30, 2024

Hi @Kontur, we need to know a few more things about this particular workflow to provide guidance here. Could you let us know:

  1. How many graphs are you working with?
  2. What is the average output size per graph?
  3. How many graphs do you need to create at runtime?

 

Best,
Aldo

New Here, May 01, 2024

Hi Aldo,

 

Thank you for getting back to me!

 

1. Our project includes hundreds of graphs (.sbsar files). Each graph represents a terrain type and has inputs for Seed and XY coordinates, producing a single channel-packed basecolor output.

2. The full heightmap resolution is 16384x16384. We plan to composite it from 4x4k texture chunks (refer to the screenshot). Our goal is to divide the workload, allowing us to start rendering the terrain around the player as quickly as possible.

3. Ideally, we would instantiate only one graph at a time—the one associated with the terrain type of the currently generated map, and we would use this single instance to asynchronously and simultaneously render four different versions (same Seed, different XY coordinates), one for each chunk. If this is (understandably) not feasible, we would then want to instantiate four separate graphs, one for each chunk, and render them simultaneously.

 

I hope this clarifies things. If you need any more information, please let me know!

 

Thank you!

New Here, May 01, 2024

Sorry about the screenshot: it shows a 4x4 grid instead of 2x2. We are still experimenting with the best settings.

Adobe Employee, Jun 05, 2024

Hi @kontur, I took a look at our substance creation during Play in Editor, and there is definitely an improvement that can be made. It currently follows the same process as importing in the editor and triggers a render of the substance; we have a task logged to separate these workflows and reduce the delay when creating a graph through Blueprints. In the meantime, could you try packaging the scene and testing? A full runtime build shouldn't have the same issue, since it isn't treated as Play in Editor.

Thank you,
Josh
