This is a fork of the original ResearchGPT. The current version modifies several settings from the original:
- Uses `GPT-3.5-turbo` instead of `GPT-3`.
- Adds Chinese language support.
- Improves results for research papers (powered by llama-index).
- Modifies the frontend.
I use this repository to study OpenAI's API, Microsoft Azure, GitHub Pages, CI/CD, and the Vue.js frontend framework. Part of the original README is reproduced below.
This is a Flask app that provides an interface for having a conversation with a research paper. You can enter a link to a PDF hosted online or upload your own PDF. The app extracts the text from the PDF, creates embeddings from the text, and uses them with the OpenAI API to generate a response to the question you ask. It also returns the source passage it used to generate the response, along with its page number.
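At a high level, the pipeline works roughly like the sketch below. This is a minimal illustration rather than the code actually used in this repository: the `pypdf` reader, page-level chunking, the `text-embedding-ada-002` model, and the pre-1.0 `openai` client interface are all assumptions made for the example.

```python
# Illustrative sketch of the PDF question-answering pipeline (not the repo's actual code).
import numpy as np
import openai
from pypdf import PdfReader  # assumed PDF library, chosen for illustration

def extract_pages(path):
    """Return one text string per PDF page."""
    return [page.extract_text() or "" for page in PdfReader(path).pages]

def embed(texts):
    """Embed a list of strings (pre-1.0 openai client API)."""
    resp = openai.Embedding.create(model="text-embedding-ada-002", input=texts)
    return np.array([item["embedding"] for item in resp["data"]])

def answer(question, pages):
    """Pick the most relevant page and ask gpt-3.5-turbo about it."""
    page_vecs = embed(pages)
    q_vec = embed([question])[0]
    best = int(np.argmax(page_vecs @ q_vec))  # ada-002 vectors are unit length, so dot product ~ cosine
    prompt = (
        f"Answer the question using this excerpt from page {best + 1}:\n"
        f"{pages[best]}\n\nQuestion: {question}"
    )
    chat = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
    )
    return chat["choices"][0]["message"]["content"], best + 1
```

In this fork, llama-index takes over much of the indexing and retrieval work, but the underlying idea is the same: rank text chunks by embedding similarity and pass the best-matching chunk to the chat model together with the question, so the source page can be reported alongside the answer.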
This web app supports queries in multiple languages and research papers in English and Chinese.
To run this app locally, you need to install the dependencies and build the frontend. First, install `python3` and `yarn` by following their official installation instructions. The app has been tested on Ubuntu and Arch Linux.
Clone the repository, install the Python dependencies, and build the frontend:
git clone https://github.com/MrPeterJin/researchgpt
cd researchgpt
pip install -r requirements.txt
cd frontend
yarn
yarn build
You also need an OpenAI API key, set as the environment variable `OPENAI_API_KEY`.
The local version caches the PDF embeddings in the `embedding` folder to reduce OpenAI API usage, saving both cost and time. You can run the app locally with:
export OPENAI_API_KEY=YOUR_API_KEY
python local.py
And then open http://127.0.0.1:8080/ in your browser.
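The embedding cache works along the lines of the sketch below. This is an assumed, simplified version (the real logic lives in `local.py`); the content-hash file naming and pickle format are illustrative choices, not necessarily what the app uses.

```python
# Simplified sketch of caching embeddings in the "embedding" folder (assumed scheme).
import hashlib
import os
import pickle

CACHE_DIR = "embedding"

def cached_embeddings(pdf_bytes, compute):
    """Return embeddings for this PDF, computing and saving them only on first use."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    key = hashlib.md5(pdf_bytes).hexdigest()      # identify the PDF by a content hash
    path = os.path.join(CACHE_DIR, f"{key}.pkl")
    if os.path.exists(path):                      # cache hit: no OpenAI API call needed
        with open(path, "rb") as f:
            return pickle.load(f)
    embeddings = compute(pdf_bytes)               # cache miss: call the embedding API once
    with open(path, "wb") as f:
        pickle.dump(embeddings, f)
    return embeddings
```

With a scheme like this, asking further questions about a PDF that has already been processed never re-embeds the document, which is where the cost and time savings come from.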
The online version does not save any data. Follow Microsoft's instructions to set up the Azure CLI; once you are logged in with `az login`, you can deploy with streamed logs:
az webapp up --runtime PYTHON:3.9 --sku B1 --logs
Azure App Service will identify `app.py` as the entry point of the app.
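For reference, Azure App Service's default Python build looks for a module-level WSGI callable named `app` in `app.py` (it starts the app with a command along the lines of `gunicorn app:app`). A minimal shape compatible with that convention is sketched below; the route is a placeholder, not an actual endpoint of this project.

```python
# Minimal shape of an app.py that Azure App Service's default startup command
# can pick up -- an illustrative sketch, not this repository's actual app.py.
from flask import Flask, jsonify

app = Flask(__name__)  # module-level WSGI callable named `app`

@app.route("/health")  # placeholder route, not a real endpoint of this project
def health():
    return jsonify({"status": "ok"})

if __name__ == "__main__":
    # Local fallback when running `python app.py` directly.
    app.run(host="0.0.0.0", port=8080)
```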
Click the deploy button below and enter your OpenAI API key to deploy the app on Railway, a free hosting platform for web apps.
The app has been containerized to simplify deployment.
To build:
docker build . -t researchgpt
To run:
docker run -p 8080:8080 -e OPENAI_API_KEY=your_api_key researchgpt
And then open http://127.0.0.1:8080/ in your browser.
Due to the PDF-to-text conversion and the way embeddings are constructed, the web app has limited ability to handle detailed queries. It may also struggle with papers whose layout differs significantly from a typical paper. We are continuing to improve the app so that it gives better responses. For now, you are encouraged to try the app on papers of fewer than 20 pages and give us feedback, although there is no hard limit on page count.
Also, the current version of the app cannot handle simultaneous requests, i.e., the web page cannot serve two or more people at the same time. This is outside the scope of my knowledge, and I am looking for help with it.