DocQA 🤖 is a web application built with Streamlit 🔥 and the LangChain 🦜🔗 framework that lets users leverage the power of LLMs for Generative Question Answering over their own documents. 🌟
Read More Here 👉 https://ai.plainenglish.io/️-langchain-streamlit-llama-bringing-conversational-ai-to-your-local-machine-a1736252b172
To run the LangChain web application locally, follow these steps:
1. Clone this repository 🔗: `git clone https://github.com/afaqueumer/DocQA.git`
2. Create a virtual environment and install the required dependencies ⚙️: run ➡️ `setup_env.bat`
3. Launch the Streamlit app 🚀: run ➡️ `run_app.bat`
Once you have the Streamlit web application up and running, you can perform the following steps:
- Upload a text file.
- Once the file has been loaded into the vector store database, the app shows a success alert: "Document is Loaded".
- Type your question into the "Ask" text box and submit it; the LLM generates an answer from the document (see the sketch below for what happens under the hood).
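The flow behind those steps is the standard LangChain retrieval-QA pattern: split the uploaded text into chunks, embed them into a vector store, and answer questions by retrieving relevant chunks for a local LLM. The snippet below is a minimal sketch of that pattern, not DocQA's actual code — the splitter settings, embedding model, FAISS store, and model path are illustrative assumptions, and the import paths follow the classic (pre-0.1) LangChain layout.

```python
from langchain.llms import LlamaCpp
from langchain.embeddings import HuggingFaceEmbeddings
from langchain.text_splitter import CharacterTextSplitter
from langchain.vectorstores import FAISS
from langchain.chains import RetrievalQA

# Split the uploaded text into overlapping chunks.
with open("example.txt", encoding="utf-8") as f:
    raw_text = f.read()
chunks = CharacterTextSplitter(chunk_size=1000, chunk_overlap=100).split_text(raw_text)

# Embed the chunks and build an in-memory vector store.
embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")
vector_store = FAISS.from_texts(chunks, embeddings)

# Wire a local LLaMA model (via llama.cpp) into a retrieval QA chain.
llm = LlamaCpp(model_path="models/llama-7b.ggmlv3.q4_0.bin")  # hypothetical model path
qa = RetrievalQA.from_chain_type(llm=llm, retriever=vector_store.as_retriever())

# Ask a question against the uploaded document.
print(qa.run("What is this document about?"))
```

In the app, this flow sits behind the Streamlit widgets: roughly, the file uploader feeds the vector store and the "Ask" box feeds the QA chain.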
Contributions are welcome! If you have ideas, suggestions, or bug fixes, please open an issue or submit a pull request.
This project is licensed under the MIT License.
🎉 Thank you 🤗 Happy question answering! 🌟