A quick question for the community: when I generate our help files and we post them, they are available on the web and catalogued by Google's search engine, etc. My understanding is that there is a way to prevent (or at least request that) Google not catalogue these files so they will not appear in public Google searches.
Is there some way to flag a project at export so that it includes metadata preventing search engines from cataloguing the content?
My thanks in advance for any thoughts!
Thank you Jeff, for your reply. 🙂 I hadn't looked at that yet, but it does mention the Robots.txt file, which has come up. The concern was that each time the project regenerates and we stick it on our server, the robots.txt would be overwritten. I've been asked to see if there's a way for RoboHELP to include this information on generation so we don't lose it. The meta robots tag sounds like it could be a solution, but again, it would need to somehow be included in every html file in the project, and part of what I was trying to determine is if there's a way for RH to add that information at generation.
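For reference, the mechanism being discussed is the standard robots meta tag, which would need to appear in the <head> of every generated HTML page (a generic example of the tag itself, not RoboHelp-specific output):

```html
<!-- Asks compliant crawlers not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">
```

The robots.txt alternative lives at the site root and blocks whole paths instead, e.g. `User-agent: *` followed by `Disallow: /help/` (the `/help/` path is hypothetical), which is why it can be overwritten by a publish step while the meta tag travels with each generated file.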
When you generate everything in the target folder is deleted. When you publish nothing is deleted. You only need to set things up once.
There's something on my site about it.
Our publishing process is a little different... I generate to the target folder, and that folder then syncs to our servers through SVN, so I'm not exactly sure what does and does not get eliminated when it comes out the other side of the process. The best solution for me would be to have the needed metadata be part and parcel of the generation process, but if that's not easily achievable, then I'd need to go back to dev and see what they can do. Thanks!
Pretty sure the devs can come up with something similar to how RoboHelp works.