# LLM Apps using Streamlit and LangChain

## Introduction

This repository showcases a suite of experimental LLM applications built with Streamlit and LangChain.

## Uses

- Streamlit (UI)
- LangChain
- OpenAI API (`gpt-3.5-turbo`)

## Run the App

### Clone the Repository

```bash
$ git clone https://github.com/rexsimiloluwah/streamlit-llm-apps.git
$ cd streamlit-llm-apps
```

### Install the dependencies

It is advisable to first create and activate a virtual environment.

```bash
$ pip install -r requirements.txt
```

### Run the app

```bash
$ streamlit run src/main.py

# Using make
$ make run-app
```

## Example Applications

### 1. Simple Document QA App

This application enables you to perform question-answering over your PDF document. It uses the RetrievalQA chain and the in-memory DocArray vector store provided by LangChain.
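
The core retrieval flow looks roughly like the sketch below. It assumes the classic LangChain API, that `OPENAI_API_KEY` is set in the environment, and uses a placeholder PDF path and question; the actual app wires these pieces into a Streamlit UI.

```python
from langchain.chains import RetrievalQA
from langchain.chat_models import ChatOpenAI
from langchain.document_loaders import PyPDFLoader
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import DocArrayInMemorySearch

# Assumes OPENAI_API_KEY is set in the environment.
docs = PyPDFLoader("document.pdf").load()           # placeholder PDF path
db = DocArrayInMemorySearch.from_documents(docs, OpenAIEmbeddings())

qa = RetrievalQA.from_chain_type(
    llm=ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0),
    chain_type="stuff",                              # stuff retrieved chunks into one prompt
    retriever=db.as_retriever(),
)
print(qa.run("What is this document about?"))        # placeholder question
```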

*Simple Document QA App example screenshot*

### 2. Web Page QA App

This application enables you to perform question-answering over content loaded from a web page. It similarly uses the RetrievalQA chain and the in-memory DocArray vector store provided by LangChain.
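
A minimal sketch of the same pattern with a web page as the source, assuming LangChain's `WebBaseLoader` and a placeholder URL and question:

```python
from langchain.chains import RetrievalQA
from langchain.chat_models import ChatOpenAI
from langchain.document_loaders import WebBaseLoader
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import DocArrayInMemorySearch

# Load and index the page content (placeholder URL).
docs = WebBaseLoader("https://example.com/article").load()
db = DocArrayInMemorySearch.from_documents(docs, OpenAIEmbeddings())

qa = RetrievalQA.from_chain_type(
    llm=ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0),
    retriever=db.as_retriever(),
)
print(qa.run("What are the main points of this page?"))
```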

*Web Page QA App example screenshot*

### 3. Document Chat App

This application enables you to chat over your PDF document. It uses the ConversationalRetrievalChain and the in-memory DocArray vector store provided by LangChain; the conversation memory (chat history) is managed externally rather than by the chain.
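
A rough sketch of how an externally managed chat history can be threaded through `ConversationalRetrievalChain`. The file path, question, and in-memory history list are placeholders; in the Streamlit app the history would typically live in `st.session_state`.

```python
from langchain.chains import ConversationalRetrievalChain
from langchain.chat_models import ChatOpenAI
from langchain.document_loaders import PyPDFLoader
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import DocArrayInMemorySearch

docs = PyPDFLoader("document.pdf").load()  # placeholder PDF path
db = DocArrayInMemorySearch.from_documents(docs, OpenAIEmbeddings())

chat = ConversationalRetrievalChain.from_llm(
    llm=ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0),
    retriever=db.as_retriever(),
)

# The chain itself holds no memory: the caller keeps the (question, answer)
# pairs and passes them back in on every call.
chat_history = []
question = "What is this document about?"
result = chat({"question": question, "chat_history": chat_history})
chat_history.append((question, result["answer"]))
print(result["answer"])
```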

*Document Chat App example screenshot*