This project is a Chrome extension that integrates a Large Language Model (LLM) to assist users by processing web page content and responding to user queries.
The front-end of the Chrome extension is built using React. It provides a user-friendly interface that allows users to interact with the LLM based on the content of the current web page.
- Captures page content excluding the extension's own interface.
- Sends user queries along with the page content to the back-end.
- Displays responses from the LLM in real-time.
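The capture step can be illustrated with a small pre-processing helper. This is a hedged sketch, not the extension's actual code: the whitespace normalization and the 8,000-character cap are assumptions about how raw page text might be trimmed before being sent to the back-end.

```javascript
// Hypothetical pre-processing for captured page text (not the extension's
// actual implementation): collapse whitespace and cap the length so very
// large pages do not overflow the LLM's context window.
function preparePageContent(rawText, maxChars = 8000) {
  const cleaned = rawText.replace(/\s+/g, " ").trim();
  return cleaned.length > maxChars
    ? cleaned.slice(0, maxChars) + "…"
    : cleaned;
}
```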
1. Clone the repository:

   ```bash
   git clone https://github.com/kongchenglc/LLamaChromeSidebar.git
   cd LLamaChromeSidebar
   ```

2. Install dependencies:

   ```bash
   yarn
   ```

3. Build the project:

   ```bash
   yarn run build
   ```

4. Load the extension in Chrome:
   - Open Chrome and go to `chrome://extensions/`.
   - Enable "Developer mode."
   - Click "Load unpacked" and select the `dist` directory of your project.
- Click the extension icon at the bottom-right corner of the window to open the sidebar.
- The extension automatically captures the content of the page.
- Enter your query and press enter to get a response from the LLM.
The back-end is built using Koa.js and handles requests from the front-end to communicate with the LLM.
- Processes incoming requests from the front-end.
- Streams responses from the LLM back to the front-end in real-time.
- Handles CORS for requests coming from the extension.
1. Clone the back-end repository:

   ```bash
   git clone https://github.com/kongchenglc/LLamaChromeSidebarBackend.git
   cd LLamaChromeSidebarBackend
   ```

2. Install dependencies:

   ```bash
   npm install
   ```

3. Set up environment variables by creating a `.env` file in the server directory and adding your Hugging Face API token:

   ```
   HF_API_TOKEN=<your-token>
   ```

4. Start the server:

   ```bash
   npm start
   ```
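The server needs `HF_API_TOKEN` at runtime. A startup guard like the following (a sketch, not the project's actual code; the function name is hypothetical) fails fast when the variable is missing:

```javascript
// Hypothetical startup guard: read the Hugging Face token from the
// environment and fail early if .env was not set up.
function requireHfToken(env = process.env) {
  const token = env.HF_API_TOKEN;
  if (!token) {
    throw new Error("HF_API_TOKEN is not set; add it to your .env file");
  }
  return token;
}
```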
`POST /chat/`

- Accepts JSON payloads with `pageContent` and `message`.
- Returns streamed responses from the LLM.
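A client call might look like the sketch below. The base URL and port (3000) are assumptions about a local dev setup, `buildChatPayload` is a hypothetical helper showing the expected JSON shape, and the streaming read requires Node 18+ or a browser for global `fetch`.

```javascript
// Hypothetical helper: build the JSON body expected by POST /chat/.
function buildChatPayload(pageContent, message) {
  return JSON.stringify({ pageContent, message });
}

// Sketch of calling the endpoint and reading the streamed reply.
async function askLLM(pageContent, message) {
  const res = await fetch("http://localhost:3000/chat/", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: buildChatPayload(pageContent, message),
  });
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let text = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    text += decoder.decode(value, { stream: true });
  }
  return text;
}
```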
To run the project, you need to set up both the front-end and back-end as described above. Make sure the back-end server is running while using the Chrome extension.
- Open a webpage in Chrome.
- Click on the extension icon to activate the sidebar.
- Enter your question or request and press enter.
- Receive a response from the LLM based on the content of the page.
This project is licensed under the MIT License.