Motivation - We could send the current conversation path to the LLM, together with the answer produced by the bot model, and have it generate a personalized response for the user. Every user could have a personalized prompt.
Specification - Store the conversation path (currently we only store the state in the path, right?). Then we need to talk to the LLM (calling the API and setting an OpenAI token). If a token and a prompt are provided, the bot should use them and replace its current response with the generated one.
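To make the specification concrete, here is a minimal Python sketch of what "calling the API + setting an OpenAI token" could look like. The function names (`build_chat_request`, `send_chat_request`), the model choice, and the shape of `conversation_path` as `(role, text)` tuples are all assumptions for illustration, not the actual bot implementation; only the endpoint URL, the `Authorization: Bearer` header, and the `model`/`messages` body fields come from the official OpenAI chat API.

```python
import json
import urllib.request

# Official OpenAI chat completions endpoint.
OPENAI_CHAT_URL = "https://api.openai.com/v1/chat/completions"


def build_chat_request(token, personalized_prompt, conversation_path):
    """Assemble headers and body for a chat completion request.

    conversation_path: list of (role, text) tuples, oldest first,
    where role is "user" or "assistant". This tuple shape is an
    assumption about how the stored path could be represented.
    """
    messages = [{"role": "system", "content": personalized_prompt}]
    messages += [{"role": role, "content": text}
                 for role, text in conversation_path]
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    body = {"model": "gpt-3.5-turbo", "messages": messages}
    return headers, body


def send_chat_request(token, personalized_prompt, conversation_path):
    """POST the request and return the generated response text."""
    headers, body = build_chat_request(token, personalized_prompt,
                                       conversation_path)
    req = urllib.request.Request(
        OPENAI_CHAT_URL,
        data=json.dumps(body).encode("utf-8"),
        headers=headers,
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

The bot would call something like `send_chat_request` only when both a token and a prompt are configured, and use the returned text in place of its own response.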
Finalised state - The bot can use the OpenAI API and talk with an LLM to enhance the conversation.
Right now, an OpenAI request is specified in the bot model as a Bot Action. For Incoming Messages that would return a simple text response, enhancing the response would just require the Incoming Message in the bot model to use a Bot Action with OpenAI parameters. In the backend manager service, we would hold back the bot's response and pass it as an "example assistant" message, along with the current conversation path, to OpenAI. We would then trigger the chat with the response generated by OpenAI, essentially replacing the bot's response.
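The message assembly described above could look like the sketch below. Note the use of a `system` message carrying `name: "example_assistant"` follows an OpenAI few-shot convention; whether the backend manager service would use exactly this convention is an assumption, as are the function name and the tuple shape of the conversation path.

```python
def build_enhancement_messages(conversation_path, held_back_response):
    """Build the OpenAI message list: an instruction, the conversation
    so far, and the bot's held-back draft as an example assistant
    message for the model to rewrite.

    conversation_path: list of (role, text) tuples, oldest first
    (an assumed representation of the stored path).
    """
    messages = [{
        "role": "system",
        "content": ("Rewrite the example assistant answer so it fits "
                    "the conversation and addresses the user personally."),
    }]
    for role, text in conversation_path:
        messages.append({"role": role, "content": text})
    # The draft response the bot would otherwise have sent.
    messages.append({
        "role": "system",
        "name": "example_assistant",
        "content": held_back_response,
    })
    return messages
```

The resulting list would then be sent as the `messages` field of a chat completion request, and the reply would replace the draft.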
The issue with the above approach arises when, in the bot model, an Incoming Message already uses another service in a Bot Action to generate the response. It seems like an Incoming Message is meant to have only one trigger function/Bot Action. Another issue mentioned is the redundant `uses` edges if we want to apply this enhancement to every Incoming Message, as shown in the sample Bot Model image above.
Perhaps for this particular enhancement, we could add a toggle field to the Messenger class that indicates whether all Incoming Messages should be personalized with OpenAI. There would be no need to create a Bot Action node in the Bot Model and connect Incoming Messages to it.
Or a toggle field for individual Incoming Messages?
We would still be able to make OpenAI requests using the Bot Action in other enhancements/use cases. And if we only want to enhance a single Incoming Message, this could still be done.
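One way the two toggle ideas could combine is a messenger-wide default with a per-message override. The class and field names below are hypothetical, just to show how the resolution logic could work; the real Messenger and Incoming Message classes live in the bot framework and may look quite different.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class IncomingMessage:
    intent: str
    # None means "inherit the messenger-wide setting";
    # True/False overrides it for this message only.
    enhance_with_openai: Optional[bool] = None


@dataclass
class Messenger:
    # Messenger-wide default: personalize all Incoming Messages?
    enhance_all_with_openai: bool = False


def should_enhance(messenger, message):
    """Per-message toggle wins; otherwise fall back to the default."""
    if message.enhance_with_openai is not None:
        return message.enhance_with_openai
    return messenger.enhance_all_with_openai
```

With this, turning the feature on for everything is one flag on the Messenger, while a single Incoming Message can still be enhanced (or excluded) individually, without any extra Bot Action nodes or `uses` edges in the model.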