This project is a chatbot built using the Llama 2 model via the `mlc_chat` library. It fetches messages from a server, generates responses, and posts the responses back to the server.
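As an illustration of the generation step, the snippet below is a minimal sketch using the `mlc_chat` `ChatModule` API; the model name is an assumption and should be replaced with whichever compiled Llama 2 chat model is installed locally.

```python
# Minimal sketch of generating a single reply with mlc_chat.
# The model name is an assumption; point it at your locally compiled Llama 2 model.
from mlc_chat import ChatModule

cm = ChatModule(model="Llama-2-7b-chat-hf-q4f16_1")
print(cm.generate(prompt="Hello! What can you do?"))
```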
## Getting Started

These instructions will get you a copy of the project up and running on your local machine for development and testing purposes.
### Prerequisites

- Python 3.6 or higher
- `requests` library
- `psutil` library
- `difflib` library
- `mlc_chat` library
### Installing

- Clone the repository
- Install the required libraries using pip:

```bash
pip install -r requirements.txt
```
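The repository is expected to provide the `requirements.txt` referenced above; a hypothetical version based on the prerequisites might look like the following. Note that `difflib` ships with the Python standard library, and `mlc_chat` wheels are typically installed from the MLC AI wheel index rather than plain PyPI.

```text
# Hypothetical requirements.txt contents (not taken from the repository).
# difflib is part of the Python standard library and needs no entry here.
requests
psutil
# mlc_chat is usually installed separately, e.g.:
#   pip install --pre mlc-ai-nightly mlc-chat-nightly -f https://mlc.ai/wheels
```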
## Running the Chatbot

To run the chatbot, execute the `sheepGPT.py` script:

```bash
python sheepGPT.py
```

The chatbot will start fetching messages from the server, generating responses, and posting them back to the server.
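To make the fetch/generate/post flow concrete, the polling loop could be sketched roughly as below. The server URL, endpoint paths, payload fields, and model name are assumptions for illustration; the actual server API used by `sheepGPT.py` may differ.

```python
# Rough sketch of the fetch -> generate -> post loop.
# Endpoints, payload fields, and the model name are assumptions, not the project's actual API.
import time

import requests
from mlc_chat import ChatModule

SERVER_URL = "http://localhost:8000"  # placeholder server address
cm = ChatModule(model="Llama-2-7b-chat-hf-q4f16_1")  # assumed compiled model name

while True:
    # Fetch pending messages from the server.
    messages = requests.get(f"{SERVER_URL}/messages", timeout=30).json()

    for message in messages:
        # Generate a reply with the Llama 2 model and post it back.
        reply = cm.generate(prompt=message["text"])
        requests.post(
            f"{SERVER_URL}/responses",
            json={"id": message["id"], "reply": reply},
            timeout=30,
        )

    time.sleep(5)  # pause between polls to avoid hammering the server
```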
## Authors

- Your Name
## License

This project is licensed under the MIT License - see the `LICENSE.md` file for details.
## Acknowledgments

- The `mlc_chat` library for providing the chat module
- The Llama 2 model