Participant
September 3, 2022
Answered

Why does my script never complete?

  • September 3, 2022
  • 2 replies
  • 352 views

Hi all

 

I'm fairly new to scripting, but not InDesign.

 

I've adapted a script I found online to help me with a document created via Data Merge. The merged document has about 1900 pages, half of which have a 26 row / 3 column table on them.

 

The script is small, and is designed to remove any blank rows from those tables.

 

Testing on one page was fine - it took a second or so to run. But when I run it on the full, 1900 page document it never seems to end. It's been running about 5 hours now on my Win10, 64-bit, i7-6400U / 8gb laptop, and is still on the hourglass. Is it that my RAM is just insufficient?

 

I can leave it all night but have more documents to run, and would like to understand scripting performance better. So can anyone comment on why it may be taking so long?

 

var myDocument = app.activeDocument;
myDocument.preflightOptions.preflightOff = true;  // for performance

var c = "";
for (var i = myDocument.textFrames.length-1; i >= 0; i--) {
    for (var j = myDocument.textFrames[i].tables.length-1; j >= 0; j--) {
        for (var k = myDocument.textFrames[i].tables[j].rows.length-1; k >= 0; k--) {
            c = myDocument.textFrames[i].tables[j].rows[k].cells.everyItem().contents.join("");
            c = trim(c);
            if (c.charCodeAt(0) == 65279 && c.length <= 12) {     // Empty row conditions
                myDocument.textFrames[i].tables[j].rows[k].remove();
            }
        }
    }
}

function trim(str) {
    return str.replace(/^\s+/, '').replace(/\s+$/, '');
}

myDocument.preflightOptions.preflightOff = false;

 

What I especially don't understand is why processing a 1900 page document takes so much longer than 1900 x the time to process 2 pages.

I've closed all other programs and documents, turned preflight off, set display performance to fast...

Task Manager reports the CPU at around 40%, and memory now seems to be fully in use.

Is my only solution to quit the current process, split this into smaller runs and then combine after converting to PDF?

Thank you for any help

 

This topic has been closed for replies.

2 replies

Peter Kahrel
Community Expert · Correct answer
September 6, 2022

8Gb should be fine.

 

The problem with your script is that at each iteration you force InDesign to create a new collection:

 

for (var i = myDocument.textFrames.length-1; i >= 0; i--) {
    for (var j = myDocument.textFrames[i].tables.length-1; j >= 0; j--) {
        for (var k = myDocument.textFrames[i].tables[j].rows.length-1; k >= 0; k--) {
            c = myDocument.textFrames[i].tables[j].rows[k].cells.everyItem().contents.join("");

 

The first line executes as many times as there are text frames; the second, once for every table; the third, once for every row. The killer is the fourth line, which rebuilds all of those collections yet again, plus a cells collection, for every row it inspects.
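A schematic way to see the cost, as runnable plain JavaScript rather than the InDesign DOM (the `getRows` function is a hypothetical stand-in for a collection property that is rebuilt on every access):

```javascript
// Stand-in for an InDesign collection: every access builds it afresh.
var builds = 0;
function getRows() {
  builds++;
  return ['r0', 'r1', 'r2', 'r3', 'r4'];
}

// Uncached: the collection is rebuilt once for the loop initializer
// and once more in every iteration of the body.
builds = 0;
for (var i = getRows().length - 1; i >= 0; i--) {
  var row = getRows()[i];
}
var uncachedBuilds = builds;   // 6 builds for 5 rows

// Cached: resolve the collection once, then index the plain array.
builds = 0;
var cachedRows = getRows();
for (var j = cachedRows.length - 1; j >= 0; j--) {
  var cachedRow = cachedRows[j];
}
var cachedBuilds = builds;     // 1 build
```

With three nested collection lookups chained on one line, as in the script above, that rebuild cost is multiplied at every level.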

 

It's no surprise, therefore, that your script runs fast on a single page, and that its running time grows much faster than linearly (roughly quadratically) as the number of tables and their sizes increase.

 

The quickest way to execute the script is to get all rows in a single call. Additionally, with so many rows to test you should precompile any regular expressions (to avoid recompiling them at every iteration). And finally, your test can be simplified.
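The precompilation point, as a runnable plain-JavaScript sketch (the `rowContents` sample strings are made up for illustration):

```javascript
// Hoist the RegExp out of the loop so it is created once, not once
// per iteration. The g flag strips every whitespace run, not just the first.
var re = /\s+/g;

// Hypothetical row contents: two all-whitespace rows and one real one.
var rowContents = ['  \u00A0  ', 'real text', '\uFEFF'];
var blankRows = [];
for (var i = rowContents.length - 1; i >= 0; i--) {
  if (rowContents[i].replace(re, '') === '') {
    blankRows.push(i);
  }
}
// blankRows is [2, 0]: only the all-whitespace rows qualify
```

Modern JavaScript engines largely optimize regex literals in loops, but in ExtendScript's older engine hoisting them out explicitly still pays off.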

 

Try the following script and see whether it makes any difference:

 

 

// Precompile the regular expression
var re = /\s+/g;
// Get all the rows in the document
var rows = app.documents[0].stories.everyItem().tables.everyItem().rows.everyItem().getElements();
for (var i = rows.length-1; i >= 0; i--) {
  if (rows[i].contents.join('').replace(re, '') === '') {
    rows[i].remove();
  }
}

 

 Peter

Participant
September 7, 2022

@Peter Kahrel - Thank you so much, that worked a treat! It's gone from needing to run overnight (I think it took 8+ hours in the end) to just over a minute.

 

For anyone who might benefit, here's the script I used in the end. It probably looks like it's been butchered to Peter, but I had to adapt it with my limited ability, as my blank rows still had zero-width no-break spaces in them (character 65279, U+FEFF).

 

// Precompile the regular expression
var re = '/\s/+';
var c='';
// Get all the rows in the document
rows = app.documents[0].stories.everyItem().tables.everyItem().rows.everyItem().getElements();
for (i = rows.length-1; i >= 0; i--) {
  c=rows[i].contents.join('').replace(re,'');
  if (c.charCodeAt===65279) {
    rows[i].remove();
  }
}

 

Peter Kahrel
Community Expert
Community Expert
September 8, 2022

The problem you had with my script was probably that you added quotes to the regular expression -- you should leave them out, see my post.

Not sure how you got your script to work: charCodeAt is a function, so it needs parentheses -- .charCodeAt()
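The difference, runnable as plain JavaScript:

```javascript
var s = '\uFEFF';   // the character with code 65279 (U+FEFF)

// Without parentheses this is the function object itself,
// so comparing it to a number is always false:
var withoutParens = (s.charCodeAt === 65279);    // false

// With parentheses (and an index) it returns the character code:
var withParens = (s.charCodeAt(0) === 65279);    // true
```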

The \s in my regular expression stands for any space, including non-breaking spaces.
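This is easy to check directly; `\s` also covers the character 65279 that the adapted script was testing for:

```javascript
// A string containing only whitespace-class characters:
// space, tab, newline, non-breaking space (U+00A0),
// and the zero-width no-break space (U+FEFF, code 65279).
var onlyWhitespace = ' \t\n\u00A0\uFEFF';

// Stripping with \s leaves nothing, so a plain blank-row test
// already handles these characters with no charCodeAt check needed.
var stripped = onlyWhitespace.replace(/\s+/g, '');
// stripped === ''
```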

Maybe try my version again -- who knows, it might run in under a minute 🙂

 

Robert at ID-Tasker
Legend
September 3, 2022

8GB is barely enough for Windows alone...

 

Then there is the matter of InDesign keeping an unlimited undo history, which takes time and resources.

 

Also, the script should be processing Stories, not TextFrames.

 

By 26 rows per page, do you mean 26 separate TextFrames with single-row tables? Maybe try one big table, using Word and its version of Data Merge? But it would still be overkill for 8GB of RAM...

 

The best option would be to split into smaller files, or modify the script to do a Save As under a new name every 500 or 1000 TextFrames; this will purge the undo buffer.

The script could also turn off screen refresh and other things; there is a thread about that:

https://community.adobe.com/t5/indesign-discussions/speedup-script-for-indesign/td-p/13146358