When you elect to use the Publish function, RoboHelp has to track the files to be published. Thus, it creates these files.
Apologies in advance if this is too basic and you already know it.
When you choose to publish and it's the first time you have published, RoboHelp carefully traverses the output folder and dutifully copies all the files and folders inside to the publish destination. At that time, it makes a list of each file copied and it tracks certain attributes. Let's say it takes five minutes to do all the copying the first time through, and that perhaps 2,000 files are published.
Now let's say you change only three of the files. You generate again and click the Publish button. Where it took five minutes before, perhaps it takes 15 seconds now and it only publishes perhaps 30 files.
The reason the time is shorter is that only 30 files actually changed, so RoboHelp only needed to copy those 30 files to the publish destination. The only way it can know which files changed is by checking against these files you are questioning and wanting to get rid of.
I'm guessing if you enabled the "Republish All" option there would be no need for them and you wouldn't see them. But that would eliminate much of the beauty of publishing. While it would still be an automated process, you would be back to waiting the full five minutes each time because RoboHelp would have to copy each and every file across whether it actually needed to be copied or not.
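The bookkeeping described above can be sketched roughly like this. This is only a minimal illustration of a manifest-based incremental publish, not RoboHelp's actual implementation; the manifest filename and the tracked attributes (size and modification time) are assumptions.

```python
import json
import os
import shutil

MANIFEST = "publish_manifest.json"  # hypothetical name; RoboHelp uses its own tracking files


def publish(src, dst):
    """Copy only files whose size or mtime changed since the last publish."""
    manifest_path = os.path.join(dst, MANIFEST)
    try:
        with open(manifest_path) as f:
            seen = json.load(f)
    except (FileNotFoundError, json.JSONDecodeError):
        seen = {}  # first publish: no manifest yet, so everything gets copied

    copied = 0
    for root, _dirs, files in os.walk(src):
        for name in files:
            src_file = os.path.join(root, name)
            rel = os.path.relpath(src_file, src)
            stat = os.stat(src_file)
            attrs = [stat.st_size, int(stat.st_mtime)]
            if seen.get(rel) != attrs:  # new file, or attributes changed
                dst_file = os.path.join(dst, rel)
                os.makedirs(os.path.dirname(dst_file), exist_ok=True)
                shutil.copy2(src_file, dst_file)
                copied += 1
            seen[rel] = attrs

    # Persist the manifest for the next run -- this is the kind of
    # "extra file" at the destination that the discussion is about.
    with open(manifest_path, "w") as f:
        json.dump(seen, f)
    return copied
```

The second run skips everything the manifest already lists with unchanged attributes, which is why repeat publishes drop from minutes to seconds.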
Yes, that's what I thought. When you are publishing via FTP, it really does make sense to keep track of the files like this because otherwise, you'd have to fetch the directory content each time etc. But for transfer via local or network file system, there are much easier and faster ways to compare source and destination folders. You don't need extra files for this. (Just like e.g. Windows Explorer doesn't need extra files to merge folders.)
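To illustrate the point: when the destination is a local or network file system, you can compare attributes on the destination directly and need no bookkeeping files at all. A minimal sketch, assuming size plus modification time is a good-enough change test (again, not how RoboHelp works, and it would not apply to FTP, where stat-ing every remote file is expensive):

```python
import os
import shutil


def sync(src, dst):
    """Mirror new/changed files from src to dst by stat-ing the destination
    itself -- no manifest or tracking files are written anywhere."""
    copied = 0
    for root, _dirs, files in os.walk(src):
        for name in files:
            src_file = os.path.join(root, name)
            dst_file = os.path.join(dst, os.path.relpath(src_file, src))
            try:
                s = os.stat(src_file)
                d = os.stat(dst_file)
                # copy2 below preserves mtime, so an up-to-date copy matches
                unchanged = (d.st_size == s.st_size
                             and int(d.st_mtime) >= int(s.st_mtime))
            except FileNotFoundError:
                unchanged = False  # not at the destination yet
            if not unchanged:
                os.makedirs(os.path.dirname(dst_file), exist_ok=True)
                shutil.copy2(src_file, dst_file)  # copies metadata too
                copied += 1
    return copied
```

This is essentially what tools like robocopy or rsync do on a mounted file system, and it leaves the output folder clean.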
The whole implementation seems to be based on the assumption that everyone is publishing their website via FTP, which was the case in the nineties, but times have changed ...
Well, I can say that in perhaps the last three years I encountered a situation with some folks that were publishing via the file system to a company intranet. When I was first consulted, each iteration took perhaps 20 minutes to publish. After coaxing them to turn off the "Republish All" function, each iteration took perhaps 20 seconds. They were amazed.
So I'm going to say that it's not only for FTP.
Are there better ways to handle it? Perhaps. But why would I care? That's something for the RoboHelp development team to deal with. I'm just happy it works well.
I'm curious as to why the existence of the files gives you grief.
Thanks for your reply. Well, it isn't really causing me sleepless nights :-) It's just a little annoying to have dozens of unnecessary files in my output. As the output will also be distributed as a ZIP file, our customers will get those files, too.
In your previous post, you wrote: "I'm guessing if you enabled the "Republish All" option there would be no need for them and you wouldn't see them." Unfortunately, that's not the case: RH always generates them. But that would be the perfect solution. If "Republish All" is enabled, overwrite all files and don't generate any bsscftp.txt files (slower, but cleaner). If "Republish All" is disabled, check for existing files and generate bsscftp.txt files (faster, but "messier").
If you’re zipping them up anyway, why not just either generate locally, zip, then extract to your network location OR generate to the network location directly (i.e. skip the Publish part entirely)?
Because there are five technical writers in my team, all working with the same project in RoboHelp. "Generate" generates the local version for each writer (to test the output individually), "Publish" merges it all together on a network folder.
That said, we might as well use two SSL outputs in our project, one named "Local", one named "Network", and skip the publish part on both. Maybe we should do that. But since there is a "File System" option under "Publish", it seemed just right.
Hmmm, you might want to re-think that idea.
Here's why. When you generate, the first step in the process is to delete the content that already exists in the output location. Only then does the process proceed with generating the new content. So for a brief period of time you might "pull the rug" from beneath anyone using the content.
That's why publish is helpful. You generate to one location and publish to another.