Terminal UI to chat with large language models (LLM) using different model backends, and integrations with your favourite editors!
Oatmeal is a terminal UI chat application that speaks with LLMs, complete with slash commands and fancy chat bubbles. Its backend-agnostic design lets you switch between the powerhouse of ChatGPT and keeping things private with Ollama. While Oatmeal works great as a standalone terminal application, it works even better paired with an editor like Neovim!
See it in action with Neovim (click to restart):
Note: This project is still quite new, and LLMs can return unexpected answers the UI isn't prepped for. There are likely a few bugs hidden somewhere.
brew install dustinblackman/tap/oatmeal
Note: This method may have outdated releases.
curl -s https://apt.dustinblackman.com/KEY.gpg | apt-key add -
curl -s https://apt.dustinblackman.com/dustinblackman.list > /etc/apt/sources.list.d/dustinblackman.list
sudo apt-get update
sudo apt-get install oatmeal
Note: This method may have outdated releases.
dnf config-manager --add-repo https://yum.dustinblackman.com/config.repo
dnf install oatmeal
nix-env -f '<nixpkgs>' -iA nur.repos.dustinblackman.oatmeal
yay -S oatmeal-bin
arch=$(uname -a | grep -q aarch64 && echo 'arm64' || echo 'amd64')
curl -L -o oatmeal.apk "https://github.com/dustinblackman/oatmeal/releases/download/v0.13.0/oatmeal_0.13.0_linux_${arch}.apk"
apk add --allow-untrusted ./oatmeal.apk
Chocolatey
choco install oatmeal --version=0.13.0
Scoop
scoop bucket add dustinblackman https://github.com/dustinblackman/scoop-bucket.git
scoop install oatmeal
Winget
winget install -e --id dustinblackman.oatmeal
cargo install oatmeal --locked
docker run --rm -it ghcr.io/dustinblackman/oatmeal:latest
Download the pre-compiled binaries and packages from the releases page and copy to the desired location.
git clone https://github.com/dustinblackman/oatmeal.git
cd oatmeal
cargo build --release
mv ./target/release/oatmeal /usr/local/bin/
The following shows the available options to start a chat session. By default, running oatmeal selects Ollama as the backend and the clipboard integration as the editor. See oatmeal --help, /help in chat, or the output below to get all the details.
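For example, a session against a local Ollama server might be started like this (the model name is illustrative; use /modellist in chat to see what your backend actually serves):
# Pick the backend and model up front...
oatmeal --backend ollama --model llama3
# ...or set the same options through environment variables.
OATMEAL_BACKEND=ollama OATMEAL_MODEL=llama3 oatmeal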
Terminal UI to chat with large language models (LLM) using different model backends, and direct integrations with your favourite editors!
Version: 0.13.0
Commit: v0.13.0
Usage: oatmeal [OPTIONS] [COMMAND]
Commands:
chat Start a new chat session.
completions Generates shell completions.
config Configuration file options.
manpages Generates manpages and outputs to stdout.
sessions Manage past chat sessions.
help Print this message or the help of the given subcommand(s)
Options:
-b, --backend <backend>
The initial backend hosting a model to connect to. [default: ollama] [env: OATMEAL_BACKEND=] [possible values: langchain, ollama, openai, claude, gemini]
--backend-health-check-timeout <backend-health-check-timeout>
Time to wait in milliseconds before timing out when doing a healthcheck for a backend. [default: 1000] [env: OATMEAL_BACKEND_HEALTH_CHECK_TIMEOUT=]
-m, --model <model>
The initial model on a backend to consume. Defaults to the first model available from the backend if not set. [env: OATMEAL_MODEL=]
-c, --config-file <config-file>
Path to configuration file [default: ~/.config/oatmeal/config.toml] [env: OATMEAL_CONFIG_FILE=]
-e, --editor <editor>
The editor to integrate with. [default: clipboard] [env: OATMEAL_EDITOR=] [possible values: neovim, clipboard, none]
-t, --theme <theme>
Sets code syntax highlighting theme. [default: base16-onedark] [env: OATMEAL_THEME=] [possible values: base16-github, base16-monokai, base16-one-light, base16-onedark, base16-seti]
--theme-file <theme-file>
Absolute path to a TextMate tmTheme to use for code syntax highlighting. [env: OATMEAL_THEME_FILE=]
--lang-chain-url <lang-chain-url>
LangChain Serve API URL when using the LangChain backend. [default: http://localhost:8000] [env: OATMEAL_LANGCHAIN_URL=]
--ollama-url <ollama-url>
Ollama API URL when using the Ollama backend. [default: http://localhost:11434] [env: OATMEAL_OLLAMA_URL=]
--open-ai-url <open-ai-url>
OpenAI API URL when using the OpenAI backend. Can be swapped to a compatible proxy. [default: https://api.openai.com] [env: OATMEAL_OPENAI_URL=]
--open-ai-token <open-ai-token>
OpenAI API token when using the OpenAI backend. [env: OATMEAL_OPENAI_TOKEN=]
--claude-token <claude-token>
Anthropic's Claude API token when using the Claude backend. [env: OATMEAL_CLAUDE_TOKEN=]
--gemini-token <gemini-token>
Google Gemini API token when using the Gemini backend. [env: OATMEAL_GEMINI_TOKEN=]
-h, --help
Print help
-V, --version
Print version
CHAT COMMANDS:
- /modellist (/ml) - Lists all available models from the backend.
- /model (/m) [MODEL_NAME,MODEL_INDEX] - Sets the specified model as the active model. You can pass either the model name, or the index from `/modellist`.
- /append (/a) [CODE_BLOCK_NUMBER?] - Appends code blocks to an editor. See Code Actions for more details.
- /replace (/r) [CODE_BLOCK_NUMBER?] - Replaces selections with code blocks in an editor. See Code Actions for more details.
- /copy (/c) [CODE_BLOCK_NUMBER?] - Copies the entire chat history to your clipboard. When a `CODE_BLOCK_NUMBER` is used, only the specified code blocks are copied to your clipboard. See Code Actions for more details.
- /quit /exit (/q) - Exit Oatmeal.
- /help (/h) - Provides this help menu.
CHAT HOTKEYS:
- Up arrow - Scroll up.
- Down arrow - Scroll down.
- CTRL+U - Page up.
- CTRL+D - Page down.
- CTRL+C - Interrupt waiting for prompt response if in progress, otherwise exit.
- CTRL+O - Insert a line break at the cursor position.
- CTRL+R - Resubmit your last message to the backend.
CHAT CODE ACTIONS:
When working with models that provide code, and using an editor integration, Oatmeal can read selected code from an editor and submit model-provided code back into the editor. Each code block provided by a model is indexed with a (NUMBER) at the beginning of the block to make it easily identifiable.
- /append (/a) [CODE_BLOCK_NUMBER?] - Appends one-to-many model-provided code blocks to the open file in your editor.
- /replace (/r) [CODE_BLOCK_NUMBER?] - Replaces selected code in your editor with one-to-many model-provided code blocks.
- /copy (/c) [CODE_BLOCK_NUMBER?] - Copies the entire chat history to your clipboard. When a `CODE_BLOCK_NUMBER` is used, copies one-to-many model-provided code blocks to your clipboard, regardless of the editor integration.
The `CODE_BLOCK_NUMBER` allows you to select several code blocks to send back to your editor at once; a few example invocations follow the list. The parameter can be set as follows:
- `1` - Selects the first code block
- `1,3,5` - Selects code blocks 1, 3, and 5.
- `2..5` - Selects an inclusive range of code blocks between 2 and 5.
- None - Selects the last provided code block.
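As a quick illustration, after a model replies with a few numbered code blocks, any of the following could be typed at the prompt:
- /append 2 - Appends code block (2) to the open file in your editor.
- /replace 1,3 - Replaces your selection with blocks (1) and (3).
- /copy 2..5 - Copies blocks (2) through (5) to your clipboard.
- /append - With no number, appends the last code block.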
On top of being configurable with command flags and environment variables, Oatmeal can also be managed with a configuration file such as this example. You can run oatmeal config create to initialize one for the first time.
Configuration file options.
Usage: oatmeal config [OPTIONS] [COMMAND]
Commands:
create Saves the default config file to the configuration file path. This command will fail if the file exists already.
default Outputs the default configuration file to stdout.
path Returns the default path for the configuration file.
help Print this message or the help of the given subcommand(s)
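For instance, bootstrapping and inspecting a config file might look like this, using only the subcommands above:
# Write the default config to the default path (fails if one already exists).
oatmeal config create
# Or print the default config and the resolved path without writing anything.
oatmeal config default
oatmeal config path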
The following model backends are supported:
- OpenAI (Or any compatible proxy/API)
- Ollama
- LangChain/LangServe (Experimental)
- Claude (Experimental)
- Gemini (Experimental)
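Because the OpenAI backend accepts any compatible proxy or API, pointing it elsewhere is just a matter of the URL and token options; the URL and token below are placeholders:
# Hypothetical local proxy; substitute your own URL and token.
OATMEAL_OPENAI_URL=http://localhost:8080 \
OATMEAL_OPENAI_TOKEN=sk-placeholder \
oatmeal --backend openai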
The following editors are currently supported. The clipboard editor is a special case: any copy or accept commands are simply copied to your clipboard. This is the default behaviour. Hit any of the links below for more details on how to use each one!
- Clipboard (Default)
- None (Disables all editor functionality)
- Neovim
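Switching editors is done with the --editor flag (or OATMEAL_EDITOR), for example:
oatmeal --editor neovim   # pair with the Neovim integration linked above
oatmeal --editor none     # disable editor functionality entirely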
A handful of themes are embedded in the application for code syntax highlighting, defaulting to OneDark. If none suits your needs, Oatmeal supports any Sublime Text / TextMate .tmTheme file via the theme-file configuration option. base16-textmate has plenty to pick from!
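For example, with a downloaded theme file (the path below is illustrative):
# Use a custom TextMate theme for code highlighting.
oatmeal --theme-file /path/to/my-theme.tmTheme
# Or switch between the embedded themes.
oatmeal --theme base16-monokai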
Oatmeal persists all chat sessions with your models, allowing you to go back and review an old conversation, or pick up from where you left off!
Manage past chat sessions.
Usage: oatmeal sessions [OPTIONS] [COMMAND]
Commands:
dir Print the sessions cache directory path.
list List all previous sessions with their ids and models.
open Open a previous session by ID. Omit passing any session ID to load an interactive selection.
delete Delete one or all sessions.
help Print this message or the help of the given subcommand(s)
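A typical flow looks something like this; the session ID below is a placeholder:
# List previous sessions with their IDs and models.
oatmeal sessions list
# Reopen one by ID, or omit the flag for an interactive picker.
oatmeal sessions open --id 01234567
oatmeal sessions open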
Grepping through previous sessions isn't something built into Oatmeal (yet). This bash function can get you there nicely using Ripgrep and FZF.
function oatmeal-sessions() {
(
cd "$(oatmeal sessions dir)"
id=$(rg --color always -n . | fzf --ansi | awk -F ':' '{print $1}' | head -n1 | awk -F '.' '{print $1}')
oatmeal sessions open --id "$id"
)
}
Or something a little more in depth (while hacky) that additionally uses yq and jq.
function oatmeal-sessions() {
(
cd "$(oatmeal sessions dir)"
id=$(
ls | \
(while read f; do echo -e "$(cat "$f")\n---\n"; done;) | \
yq -p=yaml -o=json - 2> /dev/null | \
jq -s . | \
jq -rc '. |= sort_by(.timestamp) | .[] | "\(.id):\(.timestamp):\(.state.backend_model):\(.state.editor_language):\(.state.messages[] | .text | tojson)"' | \
fzf --ansi | \
awk -F ':' '{print $1}' | \
head -n1 | \
awk -F '.' '{print $1}'
)
oatmeal sessions open --id "$id"
)
}
Each Oatmeal release includes a separate debug download to help with reporting issues and really drilling down into what the problem is! If you've run into a problem, I'd really appreciate help solving it.
- Head over to releases and download the DEBUG package for the latest release of Oatmeal.
- Extract the contents of the archive, and cd into it in your terminal.
- Run your command with the arguments provided in the error message, prefixed with RUST_BACKTRACE=1 ./oatmeal **ARGS-HERE**
- Copy/paste the output and open an issue. Include any screenshots you believe will be helpful!
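For instance, if the crash happened during a default Ollama chat, the rerun might simply be (arguments are whatever you originally used):
RUST_BACKTRACE=1 ./oatmeal --backend ollama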
Oatmeal comes with a ready-made DevContainer with all the magic needed to work on the project. However, if you wish to develop fully locally, the following will get you set up with all the necessary tooling.
cargo install cargo-run-bin
git clone https://github.com/dustinblackman/oatmeal.git
cd oatmeal
cargo cmd setup
Each backend implements the Backend trait in its own infrastructure file. The trait has documentation on what is expected of each method. You can check out Ollama as an example.
The following steps should be completed to add a backend:
- Implement trait for new backend.
- Update the BackendName enum with your new Backend name.
- Update the BackendManager to provide your new backend.
- Write tests
Each editor implements the Editor trait in its own infrastructure file. The trait has documentation on what is expected of each method. You can check out Neovim as an example.
The following steps should be completed to add an editor:
- Implement trait for new editor.
- Update the EditorName enum with your new Editor name.
- Update the EditorManager to provide your new editor.
- Write tests
Syntax highlighting language selection is a tad manual: several languages must be curated and then added to assets.toml.
- Google to find a .sublime-syntax project on GitHub for your language. bat has many!
- Update assets.toml to include the new repo. Make sure to include the license in the files array. You can leave nix-hash as an empty string and it'll be updated by a maintainer later, or if you have Docker installed, you can run cargo xtask hash-assets.
- Run rm -rf .caches && cargo build.
- Test to see that highlighting works.
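Put together, the local rebuild loop after editing assets.toml looks like this (Docker is only needed for the hash step):
# Recompute nix-hash values (optional, requires Docker); otherwise leave them empty.
cargo xtask hash-assets
# Clear cached assets and rebuild so the new syntax is picked up.
rm -rf .caches && cargo build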
I was eating a bowl of oatmeal when I wrote the first commit 🤷. (They don't let me name things at work anymore...)