The bot can talk to you like a regular Mattermost user. It's like having chat.openai.com collaboratively built into Mattermost! But that's not all: you can also use it to generate images via DALL-E or diagram visualizations via a yFiles plugin!
Here's how to get the bot running - it's easy if you have a Docker host.
You need
- the Mattermost token for the bot user (`@chatgpt` by default)
- the OpenAI API key
- a Docker server for continuously running the service; alternatively, for testing, Node.js 16 is sufficient
Andrew Zigler from Mattermost created a YouTube video that quickly guides you through the setup.
If you want to learn more about how this plugin came to life, read the blog post at yWorks.com!
These are the available options. You can set them as environment variables when running the script, when running the Docker image, or when configuring your docker-compose file.

| Name | Required | Example Value | Description |
|------|----------|---------------|-------------|
| MATTERMOST_URL | yes | `https://mattermost.server` | The URL to the server. This is used for connecting the bot to the Mattermost API |
| MATTERMOST_TOKEN | yes | `abababacdcdcd` | The authentication token of the logged-in Mattermost bot |
| OPENAI_API_KEY | yes | `sk-234234234234234234` | The OpenAI API key to authenticate with OpenAI |
| OPENAI_API_BASE | no | `http://example.com:8080/v1` | The address of an OpenAI-compatible API. Overrides the default base path (`https://api.openai.com`) |
| OPENAI_MODEL_NAME | no | `gpt-3.5-turbo` | The OpenAI language model to use, defaults to `gpt-3.5-turbo` |
| OPENAI_MAX_TOKENS | no | `2000` | The maximum number of tokens to pass to the OpenAI API, defaults to 2000 |
| OPENAI_TEMPERATURE | no | `0.2` | The sampling temperature to use, between 0 and 2, defaults to 1. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. |
| YFILES_SERVER_URL | no | `http://localhost:3835` | The URL to the yFiles graph service for embedding auto-generated diagrams. |
| NODE_EXTRA_CA_CERTS | no | `/file/to/cert.crt` | A path to a certificate file to pass to Node.js for validating self-signed certificates |
| MATTERMOST_BOTNAME | no | `"@chatgpt"` | The name of the bot user in Mattermost, defaults to `@chatgpt` |
| PLUGINS | no | `graph-plugin, image-plugin` | The enabled plugins of the bot. By default, all plugins (`graph-plugin` and `image-plugin`) are enabled. |
| DEBUG_LEVEL | no | `TRACE` | The debug level used for logging activity, defaults to `INFO` |
| BOT_CONTEXT_MSG | no | `15` | The number of previous messages which are appended to the conversation with ChatGPT, defaults to 100 |
| BOT_INSTRUCTION | no | `Act like Elon Musk` | Extra instruction to give your assistant. How should the assistant behave? |
Note: The `YFILES_SERVER_URL` is used for automatically converting text information created by the bot into diagrams. This feature is currently in development. You can see it in action here: LinkedIn Post. If you are interested in getting your hands on the plugin, please contact yWorks!
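All of these settings are plain environment variables. As a rough sketch, for a local test run you could export the required ones (placeholder values) in a POSIX shell before starting the bot, or pass them with `-e` to `docker run` as shown in the sections below:

```bash
export MATTERMOST_URL=https://mattermost.server
export MATTERMOST_TOKEN=abababacdcdcd
export OPENAI_API_KEY=sk-234234234234234234
# optional overrides, see the table above
export OPENAI_MODEL_NAME=gpt-3.5-turbo
export DEBUG_LEVEL=INFO
```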
Use the prebuilt image from `ghcr.io/yguy/chatgpt-mattermost-bot`:

```bash
docker run -d --restart unless-stopped \
-e MATTERMOST_URL=https://mattermost.server \
-e MATTERMOST_TOKEN=abababacdcdcd \
-e OPENAI_API_KEY=234234234234234234 \
--name chatbot \
ghcr.io/yguy/chatgpt-mattermost-bot:latest
```
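Once the container is up, a quick way to check that the bot connected to your Mattermost server is to follow the container logs (the container is named `chatbot` in the command above):

```bash
docker logs -f chatbot
```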
The first step is to clone this repo:

```bash
git clone https://github.com/yGuy/chatgpt-mattermost-bot.git && cd chatgpt-mattermost-bot
```
For testing, you could now just run `npm install` and `npm run start` directly, but be sure to set the environment variables or pass them to the node process first!
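For example, under a POSIX shell the required variables (placeholder values) can be passed directly on the command line when starting the process:

```bash
MATTERMOST_URL=https://mattermost.server \
MATTERMOST_TOKEN=abababacdcdcd \
OPENAI_API_KEY=sk-234234234234234234 \
npm run start
```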
For production use, in order to run the bot as a service in a Docker container that is always available without you having to run it on your own machine, you can do the following:
Build the docker image from the Dockerfile:
```bash
docker build . -t yguy/chatgpt-mattermost-bot
```
Create and run a container from the image:

```bash
docker run -d --restart unless-stopped \
-e MATTERMOST_URL=https://mattermost.server \
-e MATTERMOST_TOKEN=abababacdcdcd \
-e OPENAI_API_KEY=234234234234234234 \
--name chatbot \
yguy/chatgpt-mattermost-bot
```
If your Mattermost instance uses a TLS certificate signed by a private CA, you will need to provide the CA's public root to the container for validation.
If the root certificate is located at `/absolutepath/to/certfile.crt`, then you can mount that file into the container at a fixed position and specify the node environment variable accordingly:
```bash
docker run -d --restart unless-stopped \
-v /absolutepath/to/certfile.crt:/certs/certfile.crt \
-e NODE_EXTRA_CA_CERTS=/certs/certfile.crt \
-e MATTERMOST_URL=https://mattermost.server \
-e MATTERMOST_TOKEN=abababacdcdcd \
-e OPENAI_API_KEY=234234234234234234 \
--name chatbot \
yguy/chatgpt-mattermost-bot
```
Verify it's running:

```bash
docker ps
```

Later, to stop the service:

```bash
docker stop chatbot
```
If you want to run docker compose (maybe even merge it with your Mattermost docker stack), you can use this as a starting point: first adjust the environment variables in `docker-compose.yml`.
```yaml
MATTERMOST_URL: https://mattermost.server
MATTERMOST_TOKEN: abababacdcdcd
OPENAI_API_KEY: sk-234234234234234234
# Set this if using a custom username for the bot, default = @chatgpt
MATTERMOST_BOTNAME: "@chatgpt"
# Console logging output level, default = INFO
DEBUG_LEVEL: TRACE
# Node environment, default = production
NODE_ENV: production
```
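For orientation, a minimal compose file might look roughly like the sketch below. The service and image names follow the Docker examples above; the repository's own `docker-compose.yml` remains the authoritative reference.

```yaml
services:
  chatbot:
    image: ghcr.io/yguy/chatgpt-mattermost-bot:latest
    restart: unless-stopped
    environment:
      MATTERMOST_URL: https://mattermost.server
      MATTERMOST_TOKEN: abababacdcdcd
      OPENAI_API_KEY: sk-234234234234234234
      MATTERMOST_BOTNAME: "@chatgpt"
      DEBUG_LEVEL: INFO
      NODE_ENV: production
```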
If your Mattermost instance uses a TLS certificate signed by a private CA, you will need to provide the CA's public root to the container for validation.
If the root certificate is located at `/absolutepath/to/certfile.crt`, then you would merge the contents below into the `docker-compose.yml` file:
```yaml
services:
  chatbot:
    volumes:
      - /absolutepath/to/certfile.crt:/certs/certfile.crt:ro
    environment:
      NODE_EXTRA_CA_CERTS: /certs/certfile.crt
```
When all configuration is complete, start the container service.
```bash
docker compose up -d
```

Verify it's running:

```bash
docker compose ps
```

To stop the container:

```bash
docker compose down
```
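If the bot does not respond, following the service logs is usually the quickest way to spot configuration problems (assuming the service is named `chatbot` as in the snippet above):

```bash
docker compose logs -f chatbot
```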
The chatgpt-mattermost-bot chart deploys a containerized chatgpt-mattermost-bot instance which will connect to a running Mattermost container in the same Kubernetes cluster. The chart uses 'mattermost-team-edition' and the 'mattermost' namespace by default, and it reads the environment variables MATTERMOST_TOKEN and OPENAI_API_KEY.
```bash
helm upgrade chatgpt-mattermost-bot ./helm/chatgpt-mattermost-bot \
  --create-namespace \
  --install \
  -n mattermost \
  -f ./helm/chatgpt-mattermost-bot/values.yaml \
  --set config.MATTERMOST_TOKEN="$MATTERMOST_TOKEN" \
  --set config.OPENAI_API_KEY="$OPENAI_API_KEY"
```
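To check that the release is up, the usual Helm and kubectl commands apply, for example (using the `mattermost` namespace from the command above):

```bash
helm status chatgpt-mattermost-bot -n mattermost
kubectl get pods -n mattermost
```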
Here's an example chat that I just had with our bot:
@yguy:
@chatgpt - please write a short introductory paragraph for the README in the GitHub repo that hosts the sources for you, the friendly Mattermost bot. Tell the readers that you enjoy working for the nice people at yWorks and that everybody should install a Mattermost chatbot into their Mattermost chat system to benefit from a nice and helpful assistant like you are. I will include this in the README on the GitHub repo, so make this an engaging text! Thanks!
chatgpt (BOT):
Hello there! I'm @chatgpt, a friendly and reliable Mattermost bot designed to make your communication experience smoother and more efficient. I'm honored to be part of the team at yWorks, where I help employees get things done faster and with less hassle. With my advanced machine learning algorithms, I'm capable of understanding complex natural language queries and providing succinct and accurate responses in a matter of seconds. Whether you're a developer looking for information on your codebase or a manager seeking to streamline your team's workflows, I'm here to help. Installing a Mattermost chatbot like me is a must for any organization looking to boost its productivity and enjoy a more pleasant communication experience. So what are you waiting for? Let's chat!
I built this tool as a quick hack on a rainy Saturday afternoon, but we are using the bot in production in our Mattermost instance at our office at yWorks, and it has proved to be very helpful for many simple tasks. Give it a try and provide feedback if you like! It's really not very expensive: we had it running for about 30 users for two weeks, and that cost us less than half a dollar for the ChatGPT service!
I will also accept helpful pull requests if you find an issue or have an idea for an improvement.
Last but not least, check out yWorks' fine diagramming SDKs for software developers, yFiles, and our free online graph and diagram editors!
This is under MIT license. Copyright (c) 2023 Sebastian Mueller (yWorks) and Michael Haeglsperger (yWorks).