A Gradio App for Retrieval-Augmented-Generation on PDFs
Huginn Hears is a local app that transcribes and summarizes your meetings in Norwegian and English, using state-of-the-art models and open-source libraries. No cloud needed; everything runs offline.
Repository for the Cybersecurity-M project course taught by Professor M. Colajanni
MINeD Hackathon 2024 - Project
Genshin Impact character chat models tuned with LoRA on an LLM
"Interact with your documents using the power of GPT, 100% privately, no data leaks" - This is a customized Windows RAG client based on Zylon's open source privateGPT.
Serving open-source models of your choice as a Docker container using llama-cpp-python's OpenAI-compatible server
This is a Christian Apologetic chatbot prototype.
A Genshin Impact Question Answer Project supported by Qwen1.5-14B-Chat
A simple web interface for GGUF-format LLMs run with llama-cpp-python (llama.cpp).
Llama.cpp is a library developed in C++ for the efficient implementation of large language models, such as Meta's LLaMA. Optimized to run on a variety of platforms, including resource-constrained devices, it delivers the performance, inference speed, and memory efficiency essential for running large models.
A custom framework for easy use of LLMs, VLMs, etc. supporting various modes and settings via web-ui
Unofficial Gradio repo for the ICML 2024 paper "Executable Code Actions Elicit Better LLM Agents" by Xingyao Wang, Yangyi Chen, Lifan Yuan, Yizhe Zhang, Yunzhu Li, Hao Peng, Heng Ji.
Simple chat interface for local AI using llama-cpp-python and llama-cpp-agent
A YouTube API integration with Meta's Llama 2 to analyze comments and sentiment
A financial chatbot powered by an LLM and retrieval-augmented generation.
A quick and optimized solution to manage Llama-based GGUF quantized models: download GGUF files, retrieve message formatting, add more models from HF repos, and more. It's super easy to use and comes prepacked with the best preconfigured open-source models: Dolphin Phi-2 2.7B, Mistral 7B v0.2, Mixtral 8x7B v0.1, SOLAR 10.7B, and Zephyr 3B
AgentX is an open-source library that helps people run LLMs on their own computers, or serve LLMs as easily as possible, with support for multiple backends including PyTorch, llama.cpp, Ollama, and EasyDeL
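Several of the projects above containerize llama-cpp-python's OpenAI-compatible server (`llama_cpp.server`). A minimal sketch of that pattern is below; the model filename `model.gguf`, the image tag, and the port are placeholder assumptions, not taken from any specific repo listed here:

```dockerfile
# Sketch: serve a local GGUF model via llama-cpp-python's OpenAI-compatible server.
FROM python:3.11-slim

# llama-cpp-python compiles llama.cpp from source, so build tools are needed.
RUN apt-get update && apt-get install -y --no-install-recommends \
        build-essential cmake \
    && rm -rf /var/lib/apt/lists/*

# The [server] extra pulls in the FastAPI/uvicorn dependencies for llama_cpp.server.
RUN pip install --no-cache-dir "llama-cpp-python[server]"

# Assumption: a GGUF model file sits next to this Dockerfile.
COPY model.gguf /models/model.gguf

EXPOSE 8000
CMD ["python", "-m", "llama_cpp.server", \
     "--model", "/models/model.gguf", \
     "--host", "0.0.0.0", "--port", "8000"]
```

Once running, any OpenAI-style client can point its base URL at `http://localhost:8000/v1` to chat with the local model.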