DocOllama - Private Chat with Your Documents

Completely local RAG with chat UI

Installation

Clone the repo:

git clone [email protected]:justushar/DocOllama.git
cd DocOllama

Install the dependencies:

pip install -r requirements.txt

Fetch your LLM (llama3.1 by default):

ollama pull llama3.1:8b

Run the Ollama server:

ollama serve

Start DocOllama:

streamlit run app.py

Architecture

Ingestor

Extracts text from PDF documents and splits it into chunks (using semantic and character splitters), which are stored in a vector database.
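The character-splitting step can be sketched as a sliding window with overlap. This is a minimal illustration, not the actual code from app.py; the function name, chunk size, and overlap are illustrative defaults.

```python
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into overlapping character chunks.

    Overlap keeps sentences that straddle a boundary retrievable from
    both neighboring chunks. (Illustrative sketch, not DocOllama's code.)
    """
    chunks = []
    start = 0
    step = chunk_size - overlap
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += step
    return chunks
```

Each chunk is then embedded and written to the vector database for later similarity search.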

Retriever

Given a query, searches for similar documents, reranks the results, and applies an LLM chain filter before returning the response.
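The initial similarity-search step amounts to ranking stored chunk embeddings by cosine similarity to the query embedding. The sketch below shows that core idea with plain Python; the real retriever works against the vector database and adds reranking and LLM filtering on top, and all names here are illustrative.

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def retrieve(query_vec: list[float], doc_vecs: list[list[float]], top_k: int = 2) -> list[int]:
    """Return indices of the top_k most similar document vectors."""
    scored = sorted(
        enumerate(doc_vecs),
        key=lambda pair: cosine(query_vec, pair[1]),
        reverse=True,
    )
    return [idx for idx, _ in scored[:top_k]]
```

Reranking then reorders these candidates with a stronger (usually cross-encoder) model, and the LLM chain filter drops chunks the model judges irrelevant to the query.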

QA Chain

Combines the LLM with the retriever to answer a given user question.
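At its core, a QA chain stuffs the retrieved chunks into a prompt and hands it to the LLM. The sketch below shows that wiring with stand-in callables; the function names and prompt wording are hypothetical, not taken from app.py.

```python
def build_prompt(question: str, context_chunks: list[str]) -> str:
    """Assemble a grounded prompt from retrieved chunks (illustrative)."""
    context = "\n\n".join(context_chunks)
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}"
    )

def answer(question: str, retriever, llm) -> str:
    """Retrieve relevant chunks, then ask the LLM (illustrative wiring).

    retriever: callable mapping a question to a list of text chunks.
    llm: callable mapping a prompt string to a completion string.
    """
    chunks = retriever(question)
    return llm(build_prompt(question, chunks))
```

In DocOllama the LLM side is served by Ollama (llama3.1:8b by default) and the retriever is the component described above.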

Tech Stack
