Notorious bsscftp.txt files - any way to get rid of them?

Explorer,
Feb 23, 2016

Is there any way to prevent RoboHelp from putting bsscftp.txt files in all output folders when publishing? (Except not using the "Publish" button and just generating the output, which is not an option in my case.)

Although the file name indicates that bsscftp.txt should be related to FTP transfer only, RoboHelp ALWAYS creates the files, no matter which connection protocol you choose in the "Publish" settings (FTP / File System / HTTP / SharePoint). Is there any good reason for that? If not, I will file this as a bug on Adobe Bugbase.

My guess is that the files are used to check if a file already exists in the target folder, which can speed up the file transfer. But there's usually very little need for that when using File System transfer. I'd rather have a clean output directory than a marginal increase in speed that I don't need when transferring within my fast corporate network.

LEGEND,
Feb 23, 2016

Hi there

When you elect to use the Publish function, RoboHelp has to track the files to be published. Thus, it creates these files.

Apologies in advance if this is too basic and you already know it.

When you choose to publish and it's the first time you have published, RoboHelp carefully traverses the output folder and dutifully copies all the files and folders inside to the publish destination. At that time, it makes a list of each file copied and tracks certain attributes. Let's say it takes five minutes to do all the copying the first time through, and you have perhaps 2,000 files that are published.

Now let's say you change only three of the files. You generate again and click the Publish button. Where it took five minutes before, perhaps it takes 15 seconds now, and it only publishes perhaps 30 files.

The reason the time is shorter is that only 30 files actually changed, so RoboHelp only needed to copy those to the publish destination. The only way it can know which files changed is by checking the very files you're questioning and wanting to get rid of.

I'm guessing if you enabled the "Republish All" option there would be no need for them and you wouldn't see them. But that would eliminate much of the beauty of publishing. While it would still be an automated process, you would be back to waiting the full five minutes each time because RoboHelp would have to copy each and every file across whether it actually needed to be copied or not.

Cheers... Rick
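To make the bookkeeping described above a little more concrete, here is a rough Python sketch of manifest-based incremental publishing. It only illustrates the general technique, not RoboHelp's actual code; the manifest file name, its JSON format, and the paths are invented for the example.

import json
import os
import shutil

MANIFEST_NAME = "publish_manifest.json"  # invented stand-in for bsscftp.txt

def incremental_publish(source_dir, dest_dir):
    """Copy only files that are new or changed since the last publish."""
    manifest_path = os.path.join(dest_dir, MANIFEST_NAME)
    try:
        with open(manifest_path, encoding="utf-8") as f:
            manifest = json.load(f)      # {relative path: [size, mtime]}
    except (OSError, ValueError):
        manifest = {}                    # first publish: copy everything

    copied = 0
    for root, _dirs, files in os.walk(source_dir):
        for name in files:
            src = os.path.join(root, name)
            rel = os.path.relpath(src, source_dir)
            stat = os.stat(src)
            signature = [stat.st_size, int(stat.st_mtime)]
            if manifest.get(rel) != signature:   # new or changed file
                dst = os.path.join(dest_dir, rel)
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.copy2(src, dst)
                manifest[rel] = signature
                copied += 1

    with open(manifest_path, "w", encoding="utf-8") as f:
        json.dump(manifest, f, indent=2)
    return copied

# First run copies all ~2,000 files; later runs copy only what changed.
# incremental_publish(r"C:\RH\output\WebHelp", r"\\server\share\help")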

Explorer,
Feb 23, 2016

Yes, that's what I thought. When you are publishing via FTP, it really does make sense to keep track of the files like this, because otherwise you'd have to fetch the directory contents each time. But for transfers via the local or a network file system, there are much easier and faster ways to compare source and destination folders; you don't need extra files for this. (Windows Explorer, for example, doesn't need extra files to merge folders.)
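
For illustration, here is a rough Python sketch of that kind of direct comparison: whether to copy is decided by checking the file already in the destination, so no extra tracking file is needed. The function names and paths are just examples; tools like Windows' robocopy do essentially the same comparison.

import os
import shutil

def needs_copy(src, dst):
    """Copy when the destination file is missing or differs in size/mtime."""
    try:
        s, d = os.stat(src), os.stat(dst)
    except FileNotFoundError:
        return True
    return s.st_size != d.st_size or s.st_mtime > d.st_mtime

def mirror(source_dir, dest_dir):
    """Mirror source_dir into dest_dir by comparing the files themselves."""
    copied = 0
    for root, _dirs, files in os.walk(source_dir):
        for name in files:
            src = os.path.join(root, name)
            dst = os.path.join(dest_dir, os.path.relpath(src, source_dir))
            if needs_copy(src, dst):
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.copy2(src, dst)
                copied += 1
    return copied

# mirror(r"C:\RH\output\WebHelp", r"\\corp-server\docs\help")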

The whole implementation seems to be based on the assumption that everyone is publishing their website via FTP, which was the case in the nineties, but times have changed ...

LEGEND,
Feb 23, 2016

Well, I can say that within perhaps the last three years I encountered a situation with some folks who were publishing via the file system to a company intranet. When I was first consulted, each iteration took perhaps 20 minutes to publish. After coaxing them to turn off the "Republish All" function, each iteration took perhaps 20 seconds. They were amazed.

So I'm going to say that it's not only for FTP.

Are there better ways to handle it? Perhaps. But why would I care? That's something for the RoboHelp development team to deal with. I'm just happy it works well.

I'm curious as to why the existence of the files gives you grief.

Cheers... Rick

Explorer,
Feb 23, 2016

Thanks for your reply. Well, it isn't really causing me sleepless nights 🙂 It's just a little annoying to have dozens of unnecessary files in my output. As the output will also be distributed as a ZIP file, our customers will have those files, too.

In your previous post, you wrote: "I'm guessing if you enabled the 'Republish All' option there would be no need for them and you wouldn't see them." Unfortunately, that's not the case; RoboHelp always generates them. But that would be the perfect solution: if "Republish All" is enabled, overwrite all files and don't generate any bsscftp.txt files (slower, but cleaner); if "Republish All" is disabled, check for existing files and generate the bsscftp.txt files (faster, but "messier").
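
In the meantime, a possible workaround for the ZIP delivery is to skip the bsscftp.txt files while packaging the published output. A minimal Python sketch, with placeholder paths:

import os
import zipfile

def zip_without_tracking_files(publish_dir, zip_path):
    """Zip the published output, leaving out RoboHelp's bsscftp.txt files."""
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(publish_dir):
            for name in files:
                if name.lower() == "bsscftp.txt":   # skip the tracking files
                    continue
                full = os.path.join(root, name)
                zf.write(full, os.path.relpath(full, publish_dir))

# zip_without_tracking_files(r"\\server\share\help", r"C:\deliverables\help.zip")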

Community Expert,
Feb 23, 2016

If you’re zipping them up anyway, why not either generate locally, zip, and then extract to your network location, or generate to the network location directly (i.e. skip the Publish step entirely)?

Explorer,
Feb 23, 2016

Because there are five technical writers in my team, all working on the same project in RoboHelp. "Generate" produces the local version for each writer (to test the output individually), while "Publish" merges it all together in a network folder.

That said, we might as well use two SSL outputs in our project, one named "Local", one named "Network", and skip the publish part on both. Maybe we should do that. But since there is a "File System" option under "Publish", it seemed just right.

LEGEND,
Feb 23, 2016

Hmmm, you might want to re-think that idea.

Here's why. When you generate, the first step in the process is to delete the content that exists in the output location. Then the process proceeds with generating the content. So for a brief period of time you might "pull the rug" from beneath anyone using the content.

That's why publish is helpful. You generate to one location and publish to another.

Cheers... Rick
