The PDF AI Chat Assistant is a tool built with Next.js that lets users interact with their PDF documents through natural language. It is aimed at professionals, students, and researchers who need to query and analyze large documents quickly and accurately. The project requires a local Ollama installation and uses the LLaMA 3.1 model for processing.
- Interactive PDF Querying: Engage in real-time conversations with your PDFs, asking questions and receiving precise answers.
- Built with Next.js: Leverages the power of Next.js for a fast, scalable, and user-friendly interface.
- Local Ollama Integration: Uses the LLaMA 3.1 model through a locally installed Ollama setup for enhanced document analysis.
- Multi-language Support: Capable of handling PDFs in various languages, making it versatile for global use.
- Node.js 14.x or later
- NPM or Yarn
- Ollama installed locally
- LLaMA 3.1 model set up in Ollama
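Before installing, you can confirm the prerequisites from a terminal. This is a minimal sketch, assuming the standard `node`, `npm`, and `ollama` CLIs are on your PATH:

```bash
node --version    # should print v14.x or later
npm --version     # or: yarn --version
ollama --version  # confirms Ollama is installed
ollama list       # lists installed models; look for a llama3.1 entry
```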
- Clone the repository:

  ```bash
  git clone https://github.com/bangadam/pdf-ai-chat-assistant.git
  cd pdf-ai-chat-assistant
  ```
- Install the dependencies:

  ```bash
  npm install
  # or
  yarn install
  ```
- Set up Ollama:

  - Install Ollama locally following the instructions on the Ollama website.
  - Download and set up the LLaMA 3.1 model in your local Ollama installation (see the command sketch after this list).
- Run the development server:

  ```bash
  npm run dev
  # or
  yarn dev
  ```

  Open http://localhost:3000 in your browser to see the result.
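For the Ollama step above, the model is fetched through Ollama's own CLI. A minimal sketch, assuming the default `llama3.1` tag (swap in another tag if you need a different model size):

```bash
# Download the LLaMA 3.1 model into your local Ollama installation
ollama pull llama3.1

# Optional: send a test prompt to confirm the model responds
ollama run llama3.1 "Say hello"
```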
After setting up the application, you can upload a PDF and start interacting with it via the chat interface. The AI, powered by the LLaMA 3.1 model, will respond to your queries based on the content of the document, providing insights and extracting relevant information.
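To illustrate how such a query round-trip can work: the server side can forward the extracted PDF text together with the user's question to Ollama's local HTTP API. The sketch below is an illustrative assumption, not this repository's actual code; the `askPdf` helper and the prompt layout are made up for the example, and it assumes a runtime with a global `fetch` (Node 18+ or the browser):

```ts
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

// Hypothetical helper: ask a question about already-extracted PDF text
// by calling a local Ollama server's /api/chat endpoint.
async function askPdf(pdfText: string, question: string): Promise<string> {
  const messages: ChatMessage[] = [
    // Ground the model in the document so answers come from the PDF.
    { role: "system", content: `Answer using only this document:\n${pdfText}` },
    { role: "user", content: question },
  ];

  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "llama3.1", messages, stream: false }),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);

  // With stream: false, Ollama returns a single JSON object whose
  // message.content field holds the assistant's reply.
  const data = await res.json();
  return data.message.content as string;
}
```

A Next.js route handler could then call `askPdf(extractedText, userQuestion)` and return the answer to the chat UI.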
To deploy the application, you can use platforms like Vercel, which seamlessly integrates with Next.js.
- Ollama Setup: Ensure that Ollama is installed and the LLaMA 3.1 model is available in the deployment environment.
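A quick way to confirm this on the target machine, assuming shell access, the standard Ollama CLI, and Ollama's default port (11434):

```bash
# Verify the Ollama server is reachable and the model is present
curl http://localhost:11434/api/tags   # lists locally available models
ollama list | grep llama3.1
```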
We welcome contributions! Please read our Contributing Guide for more information on how to get involved.
This project is licensed under the MIT License - see the LICENSE file for details.
Special thanks to the Next.js and Ollama communities for providing the tools and frameworks that make this project possible.
For any inquiries or support, please open an issue on GitHub or contact us via email.