We are using InDesign Server 2020 and, with the help of JSX scripts, we build a catalogue from different templates. Templates are built from text frames, tables, and graphic frames; each template may have 300+ items (tables, text frames, etc.). All of these items are grouped using an InDesign group.
The main catalogue file (INDD) is assembled from hundreds of different template files (INDD). When we copy the contents of a template (one group) and move it into the main catalogue file, it takes a long time, more than one minute per template (100 minutes for all templates).
The following is the script which copies a group from one document and places it in the main document:
```jsx
var sourceGroup = sourceDocument.groups.item(0);
var copiedGroup = sourceGroup.duplicate(targetDocument.pages.item(0));
```
Is there any alternate method which can improve performance for moving groups from one document to another? One thing we have noticed: if the templates are smaller in terms of size/number of items, it works really well and the duplicate completes within seconds, but for a heavier template it takes minutes to duplicate and move into the main document.
When you have problems copying from a big file while you don't see the problem with a small file, you already have your solution. Turn your source document into multiple small files, e.g. once whenever the designers change it. Alternatively, export your templates/groups as snippets, or store them in a library and place them from there.
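As a sketch of the snippet route (the file paths and variable names here are assumptions, not from the original post): each group is exported once as an IDMS snippet, and the catalogue build then uses a plain place() call instead of a cross-document copy.

```jsx
// One-time export: write the template group out as an InDesign snippet (IDMS).
var sourceGroup = sourceDocument.groups.item(0);
sourceGroup.exportFile(ExportFormat.INDESIGN_SNIPPET, File("/templates/productBlock.idms"));

// Later, in the catalogue build: placing the snippet avoids copying between open documents.
var placedItems = targetDocument.pages.item(0).place(File("/templates/productBlock.idms"));
```

place() returns the page items created from the snippet, so they can be moved or regrouped afterwards as needed.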
Without watching the actual process on actual documents with a debugger/profiler, anything more is just guesswork. Could be matching of styles, could be matching of layers (if that is enabled), missing fonts in contained styles, broken links, exhaustive XML structure, a ton of other features. Also: corrupt source document, version or platform mismatch requiring conversion.
Your suggestion for snippets works really well and the time was reduced to seconds, thanks for that.
Now we are observing delays in save, close, and PDF export (background tasks are already disabled on InDesign Server) of the final document after the modifications. On average, saving a document takes 20-30 seconds, closing the same document takes more than a minute, and PDF export (small preset) takes 4-7 minutes for a 32-page document.
Basically, we pass JSON data to the InDesign script via a SOAP service, and based on the received data the script modifies elements of the document such as images and tables (templates). Currently we process updates for 1-3 spreads per call (updates in chunks). There can be many updates, e.g. 30 spreads, which invokes the InDesign script 10 times with different data chunks (data for 3 spreads in each call).
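A minimal sketch of that per-chunk flow, assuming the SOAP call passes the JSON in a script argument named jsonData (the argument name, field names, and update logic are all hypothetical):

```jsx
// InDesign Server: read the chunk passed through the SOAP runScript call.
var raw = app.scriptArgs.getValue("jsonData"); // argument name is an assumption
// ExtendScript has no built-in JSON object; eval works for trusted input,
// otherwise include a parser such as json2.js.
var data = eval("(" + raw + ")");

var doc = app.documents.item(0); // catalogue document already open on the server
for (var i = 0; i < data.updates.length; i++) {
    var u = data.updates[i];
    // ... apply u to the matching spread (swap images, fill tables, etc.)
}
doc.save();
```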
Is there a way to optimize open/save/close and PDF export time for a single document?
If the involved documents are on a shared volume, use a local volume instead.
Read about RAM disks, those are also a good idea to reduce wear on SSDs …
Also have a look at the undo modes of app.doScript, but I think that won't make a difference on InDesign Server.
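For reference, wrapping the modifications in app.doScript with a relaxed undo mode looks like this (as noted above, it may not make a difference on InDesign Server, where there is no UI undo stack to maintain):

```jsx
app.doScript(function () {
    // all document modifications go here
}, ScriptLanguage.JAVASCRIPT, [], UndoModes.FAST_ENTIRE_SCRIPT, "Apply catalogue updates");
```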
Then, watch the memory used by your Server, if the GBs sum up, give it a restart every now and then.
Otherwise I can only repeat my second paragraph. Could be anything - use appropriate tools to measure. E.g. on Mac Xcode has a sub-program called "Instruments", but even process analytics from the Activity Monitor will do. The server's scripting object model also has some additional metrics you can watch. I'm right now working on my first Server project in years, Windows to come next month, so others may be more helpful here.
Apart from Dirk_Becker's solution of splitting the document into many smaller documents for individual objects/groups, I have a different idea.
I'm using an InDesign plug-in for automated publishing (database publishing). This plug-in utilizes InDesign libraries (not the cloud libraries but the local ones). The different groups of objects can be saved in there. When creating the catalogue, the library is opened and the groups/objects are placed on the page.
Maybe you can implement that process in your InDesign Server script. After all, InDesign Server is just InDesign without a UI.
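A minimal sketch of that approach in scripting (the library path and asset name are made up for illustration):

```jsx
// Open a local InDesign library (.indl) and place a stored group into the document.
var lib = app.open(File("/templates/catalogue.indl")); // path is an assumption
var doc = app.documents.item(0);

var asset = lib.assets.itemByName("ProductBlock"); // asset name is an assumption
var placedItems = asset.placeAsset(doc); // returns the page items created in the document

lib.close();
```

placeAsset() drops the items at the position stored in the asset, so you would typically move() the returned items to their target spread afterwards.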