LLM chatbot integration with RoboHelp Frameless output
Hi,
We are currently looking to implement a chatbot in our help system. I can see that RoboHelp offers some basic chatbot functionality out of the box, using micro content for prescribed questions and answers. However, this does not suit our needs: we want a smarter, LLM-based chatbot that can search our help system for the relevant information and present it to users, regardless of what they type.
We are looking to develop an LLM-based solution ourselves and apply it to the generated HTML5 (Frameless) output from RoboHelp, but we have encountered a couple of issues:
1) Web scraping only works up to the first subheading in our content, so only the top paragraph is searched and returned.
2) We get a 403 Forbidden error on the landing page (index), which prevents it from being scraped.
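For context, our scraping step is roughly along the lines of the simplified sketch below; the base URL, landing page file name and user agent are placeholders rather than our exact setup.

```python
# Simplified sketch of our scraping step (Python, requests + BeautifulSoup).
# BASE_URL, the landing page file name and the User-Agent are placeholders.
import requests
from bs4 import BeautifulSoup

BASE_URL = "https://docs.example.com/help/"         # published Frameless output (placeholder)
HEADERS = {"User-Agent": "Mozilla/5.0 (help-bot)"}  # plain unauthenticated requests

def scrape_topic(url: str) -> str:
    """Fetch one help page and return its visible text."""
    resp = requests.get(url, headers=HEADERS, timeout=30)
    resp.raise_for_status()  # the index/landing page fails here with 403 Forbidden
    soup = BeautifulSoup(resp.text, "html.parser")
    body = soup.find("body")
    # Observed symptom: for RoboHelp topics, only the text above the first
    # subheading is returned; other HTML content in the same location comes back in full.
    return body.get_text(separator="\n", strip=True) if body else ""

if __name__ == "__main__":
    print(scrape_topic(BASE_URL + "index.htm"))     # landing page name is a placeholder
```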
Is there anything particular about the RoboHelp output that could explain these issues? Has anyone who has tried something similar run into the same problems? We have successfully taken this approach with HTML output generated by a different program, which sits in a directory in the same location as the RoboHelp-generated content we are trying to work with.
Any advice on this would be greatly appreciated.
Thanks
