Consider creating a "chat bot" that uses a locally hosted, open-source Large Language Model (LLM); several well-known ones are freely available. Then fine-tune (or otherwise ground) this generative response model (accessible only to sysadmin) by feeding it anticipated questions paired with the response you want the model to give, drawn from existing Help Menu content and a sweep of the forum, since so many of the same questions are asked over and over again, e.g., shipping bands, VAT, and so on.
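To make the idea concrete, here is a minimal sketch of the "anticipated questions" layer: before any LLM is even involved, the bot can check whether a user's question closely matches a curated Q&A pair harvested from the Help Menu or forum. All names, answers, and URLs below are purely hypothetical placeholders, not real site content.

```python
# Minimal first-level FAQ matcher (sketch only).
# The FAQ entries and URLs here are made-up examples.
from difflib import SequenceMatcher

# Curated pairs: anticipated question -> (canned answer, source URL)
FAQ = {
    "what are the shipping band rates": (
        "Shipping bands are listed under Help > Postage.",
        "/help/postage",
    ),
    "do prices include vat": (
        "Yes, listed prices include VAT where applicable.",
        "/help/vat",
    ),
}

def best_match(question, threshold=0.6):
    """Return (answer, source) for the closest FAQ entry, or None."""
    q = question.lower().strip("?! .")
    scored = [(SequenceMatcher(None, q, known).ratio(), known) for known in FAQ]
    score, known = max(scored)
    return FAQ[known] if score >= threshold else None

print(best_match("Do prices include VAT?"))
```

Only when no curated match clears the threshold would the question fall through to the generative model (or, ultimately, to messaging sysadmin).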
When a user confirms that the answer resolved their question, the bot can open the source of that answer in a new tab, keeping the response itself on-screen so the user can follow along.
For those unsure what a "chatbot" is: imagine you are on any major site such as Amazon and you click Help - you typically get a "live" web chat with a computer, which then forwards you to a real person (or, in this case, to messaging sysadmin) if your question can't be resolved through the chat.
You could also set it to auto-incorporate forum answers from approved users such as sysadmin, or answers from other trusted members that are prevetted before posting - so once you answer a question on the forum, it is folded into the training material as a source of truth.
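The "answer once on the forum, incorporate forever" idea could be sketched as a simple ingestion step with an approval gate. Everything below (the JSON store, the function name, the sample Q&A) is a hypothetical illustration, not an actual implementation.

```python
# Sketch: fold an approved forum answer into the bot's source-of-truth store.
# Assumes a plain JSON file of Q&A pairs; all names here are hypothetical.
import json
from pathlib import Path

STORE = Path("faq_store.json")

def add_approved_answer(question, answer, author, approved_by):
    """Append a vetted forum answer; only sysadmin-approved entries land."""
    if approved_by != "sysadmin":  # prevet gate for non-trusted posters
        raise PermissionError("entry must be approved before ingestion")
    entries = json.loads(STORE.read_text()) if STORE.exists() else []
    entries.append({"q": question, "a": answer, "author": author})
    STORE.write_text(json.dumps(entries, indent=2))
    return len(entries)

add_approved_answer("How do shipping bands work?",
                    "Band is set by total weight; see Help > Postage.",
                    author="sysadmin", approved_by="sysadmin")
```

A real setup would presumably hook this into the forum's moderation workflow rather than a flat file, but the gating logic is the important part.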
I see a lot of value in exploring this for BO specifically, as a first level of help - it is in keeping with the more modern approach this site takes, and it would be a leg up on competitors, who I doubt are doing this yet. Just an idea - these are surprisingly easy to learn and set up, from what I understand! No hurt feelings here if folks (or sysadmin) think this is an awful idea! ;-)