
Why is manual link folder replacement faster than script in InDesign?

Contributor, Apr 04, 2024


Hello everyone,

I have a question to ask. When I use this script to change the target folder of links, if there are many linked files, the entire script takes a long time to complete. However, if I rename the source directory first and then use InDesign's Links panel to replace the target folder for finding missing files, this method is much faster, only a fraction of the time it takes for the script to run.

What could be the reason for this significant difference in speed?

Is there any way to optimize this script?
Thank you all.

 

// Relink files under the source folder to same-named files in the target folder
var originalFolderPath = "F:/source";
var targetFolderPath = "e:/target";

var doc = app.activeDocument;
var links = doc.links;
var updatedLinksCount = 0;
for (var i = 0; i < links.length; i++) {
    var link = links[i];
    var originalLinkPath = link.filePath;
    if (originalLinkPath.indexOf(originalFolderPath) === 0 || originalLinkPath.indexOf(originalFolderPath.replace(/\//g, "\\")) === 0) {
        var originalFileName = originalLinkPath.substring(originalLinkPath.lastIndexOf('/') + 1);
        if (originalFileName == originalLinkPath) {
            originalFileName = originalLinkPath.substring(originalLinkPath.lastIndexOf('\\') + 1);
        }    
        var targetFilePath = targetFolderPath + "/" + originalFileName; 
        //$.writeln("Target File Path: " + targetFilePath);
        var targetFile = new File(targetFilePath);
        if (targetFile.exists) {
            link.relink(targetFile);
            updatedLinksCount++;
        }
    }
}

if (updatedLinksCount > 0) {
    alert("Changed: " + updatedLinksCount + " links to files with the same names in the target folder!");
} else {
    alert("No links to update found!");
}

 

TOPICS: Bug, Scripting

Views: 1.6K

1 Correct answer

Community Expert, Apr 05, 2024

Maybe try updating the links by page so you avoid such large arrays. This was 6x faster for me on a doc with 40 links per page:

 

[Screenshot: Screen Shot 4.png]

vs. all doc links

[Screenshot: Screen Shot 3.png]

 

var st = new Date().getTime ()
var targetFolderPath = "~/Desktop/target/"

var p = app.activeDocument.pages.everyItem().getElements()
var updatedLinksCount = 0;
var pg, targetFile;
for (var i = 0; i < p.length; i++){
    pg = p[i].allGraphics
    for (var j = 0; j < pg.length; j++){
        if (pg[j].itemLink != null) {
            targetFile =
...

Community Expert, Apr 04, 2024

This script is not very optimal... It runs indexOf() and GREP (string operations) on each link instead of a simple replace() inside try...catch.

 

If what you are looking for and what you want to change it with are constants - there is no need to check each link.

 

If you were relinking by some unspecified date or version or something variable - then yes, GREP would be needed. But in this case - it's a waste of resources.
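A quick way to see why a plain replace() is enough with constant paths is this plain-JavaScript sketch, runnable outside InDesign (the paths are made up for illustration): replace() returns the string unchanged when the prefix doesn't match, so no separate indexOf() check is needed.

```javascript
// Retarget a link path by swapping a constant source prefix for the
// target prefix. With a plain-string pattern, String.replace() leaves
// non-matching paths untouched.
function retarget(linkPath, sourcePrefix, targetPrefix) {
    return linkPath.replace(sourcePrefix, targetPrefix);
}

console.log(retarget("F:/source/photo.tif", "F:/source/", "e:/target/")); // e:/target/photo.tif
console.log(retarget("D:/other/logo.ai", "F:/source/", "e:/target/"));    // D:/other/logo.ai (unchanged)
```

Note that replace() with a string pattern replaces the first occurrence anywhere in the string, not strictly a prefix; for constant folder paths that distinction rarely matters.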

 

And instead of checking if destination file exists - you could also use "brute force" and try...catch - if destination file exists - will be relinked - if not - nothing bad will happen.

 

And another reason why InDesign does it faster: it is much "closer" to the system's core. JS needs to be translated/interpreted and then executed, so extra time is needed.

 

Contributor, Apr 04, 2024

@Robert at ID-Tasker Thank you for your response. Actually, there are only dozens of linked files placed in my document, but these linked files are reused across hundreds of pages, eventually reaching thousands. This slows down efficiency when changing link paths. If possible, please provide specific modifications to the code. Thank you.

Contributor, Apr 05, 2024

I have tried using try...catch in the script and found that it does indeed improve execution efficiency slightly. However, when processing too many pages, the speed is still incredibly slow. The real issue might be the large number of pages and links, which consumes too much memory and resources. Therefore, I am now attempting to split the document, processing a portion of the links each time, and then concatenating them together.

 

if (targetFolderPath) {
    try {
        var originalFileName = getOriginalFileName(originalLinkPath);
        var targetFilePath = targetFolderPath + "/" + originalFileName;
        var targetFile = new File(targetFilePath);
        if (targetFile.exists) {
            link.relink(targetFile);
            updatedLinksCount++;
        }
    } catch (e) {
    }
}

 

Community Expert, Apr 05, 2024

@Aprking

 

One more thing - you should iterate your loop backwards.

 

Contributor, Apr 05, 2024

@Robert at ID-Tasker Perhaps this is the reason for the low efficiency of my script when encountering documents with too many pages. I am a novice, and I'm afraid I cannot immediately understand your answer. However, I will continue learning. Thank you for your response.

Community Expert, Apr 05, 2024

quote

@Robert at ID-Tasker Perhaps this is the reason for the low efficiency of my script when encountering documents with too many pages. I am a novice, and I'm afraid I cannot immediately understand your answer. However, I will continue learning. Thank you for your response.


By @Aprking

 

When you relink files, InDesign destroys the original link and adds a new one at the end of the collection - so if you go 1->N you might skip the next item on the list and process the new ones again. But I'm not a JS guy, so I might be wrong on either point.

 

If you work with arrays - as @Peter Kahrel suggested - it shouldn't happen.
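The skip-and-reprocess effect can be simulated in plain JavaScript with a toy "collection" where each relink removes the current item and appends a replacement at the end (a hypothetical model of the behavior described above, not InDesign's actual internals):

```javascript
// Toy model: "relinking" item i removes it and appends the new link
// at the end of the live collection.
function relinkAt(coll, i) {
    var removed = coll.splice(i, 1)[0];
    coll.push(removed.toUpperCase()); // stand-in for the new link
}

// Forward pass: indices shift under the loop, so some items are
// skipped and freshly added links are processed again.
function forwardPass(coll) {
    var touched = [];
    for (var i = 0; i < coll.length; i++) {
        touched.push(coll[i]);
        relinkAt(coll, i);
    }
    return touched;
}

// Backward pass: appended links land beyond the starting index,
// so every original item is visited exactly once.
function backwardPass(coll) {
    var touched = [];
    for (var i = coll.length - 1; i >= 0; i--) {
        touched.push(coll[i]);
        relinkAt(coll, i);
    }
    return touched;
}

console.log(forwardPass(["a", "b", "c"]));  // "b" never visited, one new link revisited
console.log(backwardPass(["a", "b", "c"])); // ["c", "b", "a"]
```

Iterating over a snapshot array (getElements()) avoids the problem the same way the backward loop does: the loop bounds no longer track the mutating collection.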

 

Community Expert, Apr 05, 2024

@Peter Kahrel , @rob day - would you like to pitch in?

 

Community Expert, Apr 05, 2024

The first step to speed up the script is to change this line:

var links = doc.links;

to

var links = doc.links.everyItem().getElements();

Contributor, Apr 05, 2024

@Peter Kahrel Thank you for your participation and response.
After actual testing:

This code when testing a 100-page document:
var links = doc.links.everyItem().getElements(); // Execution Time: 90.085s
This code when testing a 100-page document:
var links = doc.links; // Execution Time: 66.253s
This code when testing a 360-page document:
var links = doc.links; // Execution Time: 5956.596s

The difference is astonishing; too many pages and links result in disproportionate time expenses.
Note: The content and number of links are almost the same on each page.

Community Expert, Apr 05, 2024

And some changes in the code to reduce the number of actions. If this code doesn't execute faster, at least it reads quicker 🙂

 

 

// Add the slash to the path names

var originalFolderPath = "F:/source/";
var targetFolderPath = "f:/target/";

// Work on arrays as much as possible, 
// collections process much slower

var links = app.activeDocument.links.everyItem().getElements();

var updatedLinksCount = 0;
var targetFile;
var regex = /.+[\\\/]/;  // Precompile regular expressions
for (var i = 0; i < links.length; i++) {
  targetFile = File (targetFolderPath + links[i].filePath.replace(regex,''));
  if (targetFile.exists) {
    links[i].relink(targetFile);
    updatedLinksCount++;
  }
}
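The filename-stripping regex above can be checked outside InDesign in plain JavaScript; the greedy `.+` guarantees the match runs to the last slash or backslash, leaving only the file name:

```javascript
// Strip everything up to and including the last slash/backslash.
var regex = /.+[\\\/]/;

console.log("F:/source/img.tif".replace(regex, ""));        // img.tif
console.log("F:\\source\\sub\\img.tif".replace(regex, "")); // img.tif
console.log("img.tif".replace(regex, ""));                  // img.tif (no separator, unchanged)
```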

 

 

Contributor, Apr 05, 2024

After actual execution, the original code's Execution Time is 32.037s, while this code's Execution Time is 29.917s. However, there's an issue: this code replaces links not belonging to originalFolderPath with targetFolderPath. This is a bug, and the fast speed is probably due to the lack of filtering based on originalFolderPath.

Community Expert, Apr 05, 2024

Hi @Aprking out of curiosity, are you interested in trying the `reinitLink` method? It might be faster, as it doesn't care whether the file exists. I've also done a few other things that *might* speed things up a little. Note that you need to work with URIs, not filePaths. You can use the writeln to print out the link URIs so you don't have to guess. (I chose to use URIs because I'm on MacOS and URIs were easier than writing cross-platform path handling in a hurry. All that being said, this is a real quick sketch, so it might not be what you want.)

- Mark

 

function main() {

    var originalFolderURI = "file:/Users/mark/Library/CloudStorage/Dropbox/TEST/graphics";
    var targetFolderURI = "file:/Users/mark/Desktop/graphics";

    var doc = app.activeDocument;
    var everyLink = doc.links.everyItem()
    var links = everyLink.getElements();
    var linkURIs = everyLink.linkResourceURI;
    var linkNames = everyLink.name;

    var updatedLinksCount = 0;

    for (var i = 0, link, originalLinkURI, originalFileName, targetFile; i < links.length; i++) {

        link = links[i];
        originalLinkURI = linkURIs[i];
        originalFileName = linkNames[i];

        // $.writeln('originalLinkPath = ' + originalLinkURI);

        if (0 !== originalLinkURI.indexOf(originalFolderURI))
            // no match
            continue;

        link.reinitLink(originalLinkURI.replace(originalFolderURI,targetFolderURI));

        updatedLinksCount++;

    }

    if (updatedLinksCount > 0) {
        alert("Changed: " + updatedLinksCount + " links to files with the same names in the target folder!");
    } else {
        alert("No links to update!");
    }

}

app.doScript(main, ScriptLanguage.JAVASCRIPT, undefined, UndoModes.ENTIRE_SCRIPT, "Update Links");

 

Community Expert, Apr 05, 2024

@Aprking 

 

Based on the @Peter Kahrel code - that's what I was suggesting:

 

 

// Add the slash to the path names

var originalFolderPath = "F:/source/";
var targetFolderPath = "f:/target/";

// Work on arrays as much as possible, 
// collections process much slower

var links = app.activeDocument.links.everyItem().getElements();

for (var i = links.length - 1; i >= 0; i--)  // >= 0 so the first link isn't skipped
{
  try
  {
     links[i].relink(links[i].filePath.replace(originalFolderPath, targetFolderPath));
  } catch (e) 
  {}
}

 

 

You won't have stats about how many have been changed - but who cares?

 

And the URI idea with reinitLink would probably be even quicker.

 

Community Expert, Apr 05, 2024

Hi Peter, I doubt this affects speed, but do you need the regex or the originalFolderPath variable? This works for me on OSX:

 

var targetFolderPath = "~/Desktop/target/" 
var links = app.activeDocument.links.everyItem().getElements();
var updatedLinksCount = 0;
var targetFile;
for (var i = 0; i < links.length; i++) {
    targetFile = File (targetFolderPath +  links[i].name);
    if (targetFile.exists) {
        links[i].relink(targetFile);
        updatedLinksCount++;
    }
} 

 

Also, @Aprking did not mention where the targetFolderPath is—is it on the startup drive or a networked volume? If it’s on a network volume maybe that’s where things slow down?

Community Expert, Apr 05, 2024

Rob -- As I understood it the graphics are in the source and the target folder. The links are to the source folder and need to be changed to the target folder. The regex I did was simply to strip the (source) folder from a link's full path name (rightly or wrongly).

Community Expert, Apr 05, 2024

Yes, they both work for me—I wasn’t sure if the regex replace() function would impact speed?

 

I ran a test on 40 pages with 40 links with the target folder on my desktop, and the time was 8 sec.

Community Expert, Apr 05, 2024

The same test with the target on a networked volume took 32 sec.

Contributor, Apr 05, 2024

@rob day I received your help last year, and I'm glad to receive your response again today.

My document has about 2000 links when it's only 50 pages long. Currently, I've realized it's not a scripting issue, but rather that the search volume is 1x40 when the document is one page and 1x2000 when it's 50 pages. This compounds further with 50 or even more pages, resulting in astronomically high search volumes that cause significant slowdowns. Therefore,

I'm trying to split the

var links = doc.links;

into several parts for processing.

Thank you all.
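The splitting idea can be sketched in plain JavaScript (the batch size of 3 in the usage line is an arbitrary illustration, not a recommendation):

```javascript
// Split an array into fixed-size batches so each pass touches
// a bounded number of links.
function chunk(arr, size) {
    var batches = [];
    for (var i = 0; i < arr.length; i += size) {
        batches.push(arr.slice(i, i + size));
    }
    return batches;
}

// In InDesign this would be applied to the array from
// app.activeDocument.links.everyItem().getElements(),
// relinking one batch at a time.
var batches = chunk([1, 2, 3, 4, 5, 6, 7], 3);
console.log(batches.length); // 3
console.log(batches[2]);     // [7]
```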

Community Expert, Apr 05, 2024

Currently, I've realized it's not a scripting issue

 

Is your target folder on the startup drive? I’m seeing a 4x difference in speed between a networked volume and the startup drive.

Contributor, Apr 05, 2024

@rob day I think I still haven't made the issue clear. Please duplicate the pages in your test document until it reaches 300 pages, then test again. See whether the processing time increases linearly or geometrically. Thank you.

Community Expert, Apr 05, 2024

@Aprking :

> What could be the reason for this significant difference in speed?

 

Scripts will always be slower for these kinds of operations than native InDesign actions (as others also mentioned). To match the speed of InDesign you'd need to write a C++ plug-in (though, of course, all the great suggestions for script optimization you received here can help speed it up).

Contributor, Apr 05, 2024

I have tested it countless times. When the document is under 100 pages, the processing time still increases linearly, typically ranging from a few seconds to a few tens of seconds. However, when the page count exceeds 300 pages and the number of links exceeds 12,000, the processing time increases exponentially. It usually takes several tens of minutes or even over an hour. The script optimizations provided by everyone can only reduce the processing time by a few seconds. I still haven't found the reason for this. Nevertheless, I appreciate everyone's help.

Community Expert, Apr 05, 2024

Please duplicate the pages in your test document until it reaches 300 pages, then test again.

 

320 pages took 15sec with this code and the target folder on my startup’s desktop folder. I’m using this code:

 

var targetFolderPath = "~/Desktop/target/" 
var links = app.activeDocument.links.everyItem().getElements();
var updatedLinksCount = 0;
var targetFile;
for (var i = 0; i < links.length; i++) {
    targetFile = File (targetFolderPath +  links[i].name);
    if (targetFile.exists) {
        //links[i].parent.place(targetFile)
        links[i].relink(targetFile);
        updatedLinksCount++;
    }
}  

Contributor, Apr 05, 2024

Can you tell me how many links are in the document? I have over ten thousand links in my document. Thanks again for your help.
