Hello!
We are trying to blend 2-3 material subgraphs in Designer to create more complex ground materials, but even after optimizing the graphs with all the recommendations we could find in the documentation, editing parameters at the end of the graph is still very slow, even for a simple level. I also resolved all warnings and errors in the console and increased the memory budget in the preferences (it wasn't very high anyway, 1 to 10%). The same happens with different graphs using different materials, and on other PCs as well. I tried both the GPU and CPU engines, and GPU drivers are up to date.
I'm pretty sure it's not related to the hardware, since I'm working with an RTX 3080 GPU, a Threadripper PRO 3945WX CPU and 64 GB of RAM.
It feels like nothing is kept in cache and everything is recomputed every time. In fact, toggling the "Enable node image cache" checkbox makes no difference: the time is the same, around 9-10 seconds for a 1024x1024 graph.
Sure, it's a big graph (a total of 1500-2000 ms when I add everything together), but 9-10 seconds for a simple change at the end of the graph seems odd to me, as the upstream nodes shouldn't need to be recomputed (and 2000 ms should take 2 seconds, shouldn't it?).
Is this normal? We had the impression it wasn't like this before, but that could just be an impression. We noticed that when we import the same materials and blend them in Painter instead, it's much faster. Sadly, Painter isn't an option in this case because of other limitations.
The only workaround we found is to export the outputs and reimport them as bitmaps to be able to work efficiently, which is obviously not very convenient.
Thanks in advance for your help! 🙂
Hello,
Thank you for the detailed report and for your patience!
It appears the entire graph is being pre-processed (or "cooked") each time anything in it changes, which should not happen. Graphs in published Substance 3D asset files (SBSAR) are already cooked, which is why this is not an issue in Painter.
Can you share the Substance 3D file (SBS) related to this issue? If you can, please share a file which includes its dependencies: right-click the package in the Explorer panel and select the Export with dependencies option in the contextual menu. In the export dialog, check the Build archive option for a convenient, single shareable archive file.
Best regards.
Hello! Thanks for the reply.
Is there a way I can send it to you directly?
Hello @Pierre-Alexandre5C98,
There is indeed! I have sent you an email with a link to a dedicated folder for safely uploading files to me exclusively.
Best regards.
Thanks! I uploaded the files.
Don't hesitate if you need anything else 🙂
Pierre-Alexandre
Hello @Pierre-Alexandre5C98,
I have received the files, thank you for sharing!
I can reproduce the performance issues and will investigate this with our engineers. I will get back to you with our findings.
Best regards.
Hello @Pierre-Alexandre5C98,
I have some news regarding the performance issues: the graph is impacted cumulatively by two known issues related to cache.
Both of these issues combine into what you experienced: most of the graph needs to be entirely recomputed any time a change is made to most of its nodes.
I can offer two suggestions to mitigate this:
I understand these are frustrating issues to deal with, and both are a result of the current implementation of graph evaluation, cache management and the Substance Engine itself. One of our engineers has been hard at work for a while now to improve some of these components, which is a challenging technical endeavour.
I hope this is informative, and I appreciate your patience!
Best regards.
Thank you very much! We'll try your suggestions.
Using SBS instead of SBSAR shouldn't be a problem, as we mostly use SBS files anyway (even if that wasn't the case in the graph I sent you).
By "values", do you mean values specifically coming from a Value Processor or a value input, i.e. the green links in the graph?
We use these a lot to re-evaluate the scale in the graph after each material blend, convert the result to world units, and generate our AO, normal, etc. Do you think converting these to grayscale values from 0 to 1 would solve the problem instead of using floats, or would it be essentially the same? That way I could transfer the information with grayscale inputs instead of value inputs.
I understand this is very technical, but do you have an idea (even a vague one) of when we'll see these improvements?
Hello @Pierre-Alexandre5C98,
I hope you had a relaxing weekend! Regarding each of your questions:
By "values", do you mean values specifically coming from a Value Processor or a value input, i.e. the green links in the graph?
I mean the "value" data type, i.e. all green links in the graph, including the output of Value processor nodes.
Do you think converting these to grayscale values from 0 to 1 would solve the problem instead of using floats, or would it be essentially the same?
Since the issue lies specifically in how the "value" data type is handled by the engine, replacing all values with greyscale bitmaps would indeed eliminate the issues stemming specifically from values. You may use 16x16 pixel bitmaps to minimise the calculation time and memory overhead.
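To illustrate the principle of the workaround (a conceptual sketch in plain Python, not Designer's actual API, and the parameter range is an assumed example): you normalise the scalar into the 0-1 range a greyscale channel can carry, store it as a uniform 16x16 image, and map it back to the original range wherever it is consumed.

```python
# Conceptual sketch of the greyscale workaround, not Designer's API.
# SCALE_MIN / SCALE_MAX are assumed example bounds for the parameter.
SCALE_MIN, SCALE_MAX = 0.0, 10.0

def encode_scale(scale):
    """Normalise a scalar into 0-1 and store it as a uniform 16x16 'bitmap'."""
    t = (scale - SCALE_MIN) / (SCALE_MAX - SCALE_MIN)
    t = min(max(t, 0.0), 1.0)  # greyscale data must stay within 0-1
    return [[t] * 16 for _ in range(16)]

def decode_scale(bitmap):
    """Read the uniform value back and map it to the original range."""
    return SCALE_MIN + bitmap[0][0] * (SCALE_MAX - SCALE_MIN)
```

The trade-off is that the value now travels as image data, so it benefits from the engine's image cache, at the cost of having to agree on the min/max range at both ends.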
Do you have an idea (even if it's vague) of when we'll see these improvements?
The graph processing and cache improvements are an ongoing effort which should hopefully start bearing fruit in this year's summer release, with improvements to graph invalidation – i.e. the process of assessing which parts of the graph actually have outdated data and need to be recomputed. Other improvements will be implemented across multiple releases.
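To make the idea of graph invalidation concrete, here is a minimal illustrative sketch (plain Python, not the actual engine code, with a hypothetical mini-graph): only nodes downstream of an edit are marked as dirty, so everything upstream can keep its cached result.

```python
# Illustrative sketch of graph invalidation, not the actual engine code.
def invalidate(downstream, changed):
    """Return the set of nodes whose cached data is outdated after
    'changed' is edited, by walking the downstream edges."""
    dirty, stack = set(), [changed]
    while stack:
        node = stack.pop()
        if node in dirty:
            continue
        dirty.add(node)
        stack.extend(downstream.get(node, []))
    return dirty

# Hypothetical mini-graph: two noises feed a blend, which feeds the output.
downstream = {"noise": ["blend"], "grunge": ["blend"], "blend": ["output"]}
# Editing "blend" should leave both noises' cached results usable.
```

In an ideal implementation, editing a node near the end of the graph would dirty only that node and its downstream consumers, which is why a correct cache should make late-graph edits fast.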
The issues specific to the "value" data type are architectural and need a significant rework of both graph evaluation and the Substance engine. While we are unable to commit to any ETA for improvements on that side, we are aware of the impact the current limitations have and do wish to improve it in the future.
I hope this adequately answers your questions! Feel free to let me know if any others come to mind.
Best regards.
Hello! Again, thanks for your answers!
In case someone reads this thread looking for an alternative: I did some tests last week with the grayscale input solution, and it did indeed solve the performance issue.
Looking forward to this summer release 🙂
Hello @Pierre-Alexandre5C98,
Thank you for sharing, I am glad this discussion was helpful and the grayscale inputs solution works well for you!
Feel free to get back to me if you have any other questions or feedback.
Best regards.