The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
Updated Dec 18, 2024 - TypeScript
A single-file tkinter-based Ollama GUI project with no external dependencies.
Chat with your PDF using a local LLM via the Ollama client.
TalkNexus: Ollama Multi-Model Chatbot & RAG Interface
Ollama with Let's Encrypt Using Docker Compose
Ollama Web UI is a simple yet powerful web-based interface for interacting with large language models. It offers chat history, voice commands, voice output, model download and management, conversation saving, terminal access, multi-model chat, and more—all in one streamlined platform.
Streamlit Chatbot using Ollama Open Source LLMs
A simple and lightweight client-server program for interfacing with local LLMs using Ollama, and with LLMs on Groq using the Groq API.
A modern web interface for [Ollama](https://ollama.ai/), featuring a clean design and essential chat functionalities.