Reducing token cost by adding a canned response chatbot off a database that GPT populates #110
OsbornVentures
started this conversation in
Ideas
Replies: 2 comments
-
https://github.com/LAION-AI/Open-Assistant
-
The model has been updated to ChatGPT now; I will keep an eye on Open-Assistant, thank you.
-
Build a canned-response chatbot on top of this bot that serves answers from a database.
Design and implement two databases:
--A "canned" database for storing previous questions and answers that have been asked at least three times.
--A "live" database for storing current questions and answers that have not yet been canned.
--Check the "canned" database for a matching question.
--If a matching question is found, choose one of the previous responses at random (with a 1/3 chance for each) and send it to the Discord server.
--If a matching question is not found, submit the question to the GPT API and save the question and answer to the "live" database.
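The lookup steps above could be sketched as follows. `ask_gpt` stands in for the real GPT API call, and the table layout matches the two-database design; both are assumptions, not code from the original post. `random.choice` gives each stored response an equal chance (1/3 each when three answers exist).

```python
import random
import sqlite3

def answer_question(conn, question, ask_gpt):
    """Serve a canned answer if the question is known; otherwise call GPT.

    `ask_gpt` is a placeholder for the GPT API call.
    """
    rows = conn.execute(
        "SELECT answer FROM canned WHERE question = ?", (question,)
    ).fetchall()
    if rows:
        # Pick one of the stored responses uniformly at random.
        return random.choice(rows)[0]
    # No canned match: fall through to the API and record the exchange.
    answer = ask_gpt(question)
    conn.execute(
        "INSERT INTO live (question, answer) VALUES (?, ?)",
        (question, answer),
    )
    conn.commit()
    return answer
```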
Implement a similarity check using difflib. If an incoming question is highly similar (e.g., 95% similar) to a previously canned question, serve the answer stored for the most similar question. This lookup runs against the "canned" database.
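The similarity check maps directly onto the standard library's `difflib.get_close_matches`, which ranks candidates by `SequenceMatcher` ratio and drops anything below a cutoff. The function name and the exact cutoff handling are assumptions in this sketch; the 0.95 cutoff mirrors the 95% figure above.

```python
import difflib

def best_canned_match(question, canned_questions, cutoff=0.95):
    """Return the most similar stored question, or None if nothing clears the cutoff."""
    # get_close_matches returns up to n candidates scoring >= cutoff, best first.
    matches = difflib.get_close_matches(question, canned_questions, n=1, cutoff=cutoff)
    return matches[0] if matches else None
```

In practice it may be worth normalizing case and punctuation before comparing, since `SequenceMatcher` treats "I" and "i" as different characters.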
Create a separate process that sorts through the "live" database and populates the "canned" database with any questions that have been asked at least three times.
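The promotion pass above might look like this, assuming the "live" table keeps one row per ask (so a `GROUP BY ... HAVING COUNT(*) >= 3` finds repeats, and moving all of a question's rows gives the canned side its pool of alternate answers). The function and table names are assumptions for this sketch.

```python
import sqlite3

def promote_repeats(conn, min_count=3):
    """Move questions asked at least `min_count` times from "live" to "canned"."""
    repeats = conn.execute(
        "SELECT question FROM live GROUP BY question HAVING COUNT(*) >= ?",
        (min_count,),
    ).fetchall()
    for (question,) in repeats:
        # Copy every stored answer for this question, then clear it from "live".
        conn.execute(
            "INSERT INTO canned (question, answer) "
            "SELECT question, answer FROM live WHERE question = ?",
            (question,),
        )
        conn.execute("DELETE FROM live WHERE question = ?", (question,))
    conn.commit()
```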
As the "canned" database grows, consider lowering the temperature and adjusting max_token values, or potentially switching to a different model such as Curie, to tune the chatbot's responses to the operator's preference.
Overall, the goal of this chatbot add-on is to create a community database of questions and answers that can be used by future developers to reduce token costs by avoiding unnecessary API calls. The chatbot will save on token costs by using previously asked questions and answers from the "canned" database when possible, while still providing a sense of randomness and organic conversation by choosing responses at random. As the "canned" database grows, the chatbot's responses will become more cost-effective.
- GPT helped write this pitch