Aprking
Inspiring
April 4, 2024
Answered

Why is manual link folder replacement faster than script in InDesign?

  • April 4, 2024
  • 11 replies
  • 7476 views

Hello everyone,

I have a question. When I use the script below to change the folder that links point to, it takes a long time to complete if there are many linked files. However, if I rename the source directory first and then use InDesign's Links panel to point the resulting missing links at the new folder, it is much faster: the whole operation finishes in a fraction of the time the script takes.

What could be the reason for this significant difference in speed?

Is there any way to optimize this script?
Thank you all.

 

// Relink every document link from originalFolderPath to targetFolderPath,
// keeping the same file names.
var originalFolderPath = "F:/source";
var targetFolderPath = "e:/target";

var doc = app.activeDocument;
var links = doc.links;
var updatedLinksCount = 0;
for (var i = 0; i < links.length; i++) {
    var link = links[i];
    var originalLinkPath = link.filePath;
    // Match the source folder whether the stored path uses "/" or "\".
    if (originalLinkPath.indexOf(originalFolderPath) === 0 ||
        originalLinkPath.indexOf(originalFolderPath.replace(/\//g, "\\")) === 0) {
        // Extract the file name; fall back to "\" if no "/" was found.
        var originalFileName = originalLinkPath.substring(originalLinkPath.lastIndexOf('/') + 1);
        if (originalFileName == originalLinkPath) {
            originalFileName = originalLinkPath.substring(originalLinkPath.lastIndexOf('\\') + 1);
        }
        var targetFilePath = targetFolderPath + "/" + originalFileName;
        //$.writeln("Target File Path: " + targetFilePath);
        var targetFile = new File(targetFilePath);
        if (targetFile.exists) {
            link.relink(targetFile);
            updatedLinksCount++;
        }
    }
}

if (updatedLinksCount > 0) {
    alert("Changed: " + updatedLinksCount + " links to files with the same names in the target folder!");
} else {
    alert("No links to update found!");
}
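As an aside, the two-step file-name extraction in the loop above could be factored into a small helper. This is just a sketch (the `getFileName` name is hypothetical, not part of the original script); it is plain JavaScript and handles both path separators in one step:

```javascript
// Hypothetical helper: return the file name from a link path,
// whether it uses "/" or "\" as separator.
function getFileName(path) {
    // lastIndexOf returns -1 when the separator is absent, and
    // -1 + 1 === 0 keeps the whole string in that case.
    var lastSlash = Math.max(path.lastIndexOf('/'), path.lastIndexOf('\\'));
    return path.substring(lastSlash + 1);
}
```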

 

This topic has been closed for replies.
Correct answer: rob day

Local hard drive, of course.

This is my test data:

  • 20 pages × 40 links: 4.52 s
  • 50 pages × 40 links: 14.752 s
  • 100 pages × 40 links: 32.159 s
  • 365 pages × 40 links: 835.581 s

The key factor is the number of links; once it passes a certain threshold, the processing time grows much faster than linearly.
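A quick back-of-the-envelope check of the timings above (plain JavaScript, runnable outside InDesign) makes the non-linear scaling visible: if the cost per link were constant, milliseconds-per-link would stay flat, but it jumps roughly tenfold between the smallest and largest run.

```javascript
// Per-link cost computed from the reported timings (40 links per page).
var LINKS_PER_PAGE = 40;
var runs = [
    { pages: 20,  seconds: 4.52    },
    { pages: 50,  seconds: 14.752  },
    { pages: 100, seconds: 32.159  },
    { pages: 365, seconds: 835.581 }
];
var perLink = [];
for (var i = 0; i < runs.length; i++) {
    var links = runs[i].pages * LINKS_PER_PAGE;
    // milliseconds spent per link in this run
    perLink.push((runs[i].seconds * 1000) / links);
}
// perLink[0] is ~5.65 ms/link (800 links);
// perLink[3] is ~57.2 ms/link (14600 links).
```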


Maybe try updating the links page by page so you avoid such large arrays. This was 6x faster for me on a doc with 40 links per page:

vs. all doc links

var st = new Date().getTime();
var targetFolderPath = "~/Desktop/target/";

// Walk the document page by page instead of through doc.links,
// so each pass works on a small per-page array of graphics.
var p = app.activeDocument.pages.everyItem().getElements();
var updatedLinksCount = 0;
var pg, targetFile;
for (var i = 0; i < p.length; i++) {
    pg = p[i].allGraphics;
    for (var j = 0; j < pg.length; j++) {
        if (pg[j].itemLink != null) {
            targetFile = File(targetFolderPath + pg[j].itemLink.name);
            if (targetFile.exists) {
                pg[j].itemLink.relink(targetFile);
                updatedLinksCount++;
            }
        }
    }
}
var et = new Date().getTime();
alert((et - st) / 1000 + " seconds");

11 replies

Robert at ID-Tasker
Legend
April 4, 2024

This script is not very optimal... It runs indexOf() and a regex replace on every link - string operations - instead of just relinking inside a try...catch.

 

If what you are looking for and what you want to change it with are constants - there is no need to check each link.

 

If you were relinking by some unspecified date or version or something variable - then yes, a regex would be needed. But in this case it's a waste of resources.

 

And instead of checking whether the destination file exists - you could also use "brute force" and try...catch - if the destination file exists it will be relinked - if not, nothing bad will happen.
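The "brute force" pattern described here could be sketched as below. This is only an illustration of the control flow, not the actual InDesign code: `relinkAll` is a hypothetical helper, the link objects stand in for InDesign's `Link` DOM objects (which take a `File` in `relink()`, not a string), and whether `relink()` throws for a missing target is the premise of the suggestion, not something verified here.

```javascript
// Sketch: skip the exists-check entirely and let a failed relink()
// throw; the catch swallows the error and the loop moves on.
function relinkAll(links, targetFolderPath) {
    var updated = 0;
    for (var i = 0; i < links.length; i++) {
        try {
            // In real ExtendScript this would be
            // links[i].relink(new File(targetFolderPath + "/" + links[i].name));
            links[i].relink(targetFolderPath + "/" + links[i].name);
            updated++;
        } catch (e) {
            // target file missing: nothing bad happens
        }
    }
    return updated;
}
```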

 

And another reason why InDesign itself does it faster: it is much "closer" to the system's core, while JS needs to be interpreted and then executed, so extra time is needed.

 

Aprking (Author)
Inspiring
April 4, 2024

@Robert at ID-Tasker Thank you for your response. Actually, only a few dozen linked files are placed in my document, but they are reused across hundreds of pages, eventually adding up to thousands of link instances. That is what makes changing the link paths so slow. If possible, please show the specific code changes. Thank you.

Aprking (Author)
Inspiring
April 5, 2024

I have tried using try...catch in the script and found that it does improve execution efficiency slightly. However, when processing too many pages, it is still incredibly slow. The real issue might be the sheer number of pages and links, which consumes too much memory and other resources. So I am now trying to split the document, process a portion of the links each time, and then join the parts back together.

 

if (targetFolderPath) {
    try {
        var originalFileName = getOriginalFileName(originalLinkPath);
        var targetFilePath = targetFolderPath + "/" + originalFileName;
        var targetFile = new File(targetFilePath);
        if (targetFile.exists) {
            link.relink(targetFile);
            updatedLinksCount++;
        }
    } catch (e) {
        // ignore relink failures and move on
    }
}
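The splitting idea above could be sketched generically like this. It is not InDesign-specific: `processInChunks` and `processItem` are hypothetical names, and `processItem` stands in for the per-link relink step; the point is only that each pass touches a small, fixed-size slice of the work.

```javascript
// Sketch: process items in fixed-size chunks so each pass works on
// a small batch rather than one huge array.
function processInChunks(items, chunkSize, processItem) {
    var processed = 0;
    for (var start = 0; start < items.length; start += chunkSize) {
        var end = Math.min(start + chunkSize, items.length);
        for (var i = start; i < end; i++) {
            processItem(items[i]);
            processed++;
        }
        // In InDesign one might save the document or otherwise release
        // resources between chunks here (an assumption, not verified).
    }
    return processed;
}
```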