This repository provides a Proof of Concept (POC) for integrating LlamaIndex with Azure OpenAI services.
- Requires Python 3.12.3
- Check Python Version

  Ensure you have Python 3.12.3 installed:

  ```bash
  python --version
  ```
- Create a Virtual Environment

  Create a virtual environment to manage dependencies:

  ```bash
  virtualenv venv
  ```

- Activate the Virtual Environment

  Activate the virtual environment:

  ```bash
  source venv/bin/activate
  ```
- Install Required Packages

  Install the required packages from `requirements.txt`:

  ```bash
  pip install -r requirements.txt
  ```
- Run the Application

  Start the application with Uvicorn; the `--reload` flag enables auto-reload during development:

  ```bash
  uvicorn app.main:app --reload
  ```
- Freeze Dependencies

  Freeze the current state of dependencies to `requirements.txt`:

  ```bash
  pip freeze > requirements.txt
  ```
To run the backend:

```bash
uvicorn app.main:app
```
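For orientation, here is a minimal sketch of the module shape that `uvicorn app.main:app` expects: a Python module `app/main.py` that exposes an ASGI application object named `app`. The FastAPI framework, the `/query` route, and the request model below are assumptions for illustration, not this repository's actual interface.

```python
# app/main.py -- illustrative skeleton only; the repository's real module
# defines its own routes. Uvicorn imports the `app` object from `app.main`.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="LlamaIndex + Azure OpenAI POC")


class QueryRequest(BaseModel):
    question: str


@app.get("/health")
def health() -> dict:
    # Simple liveness probe for the backend.
    return {"status": "ok"}


@app.post("/query")
def query(req: QueryRequest) -> dict:
    # Placeholder: the real endpoint would call a LlamaIndex query engine
    # backed by Azure OpenAI and return its response.
    return {"answer": f"You asked: {req.question}"}
```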
To run the Streamlit frontend:

```bash
streamlit run frontend/app.py
```
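As a rough companion sketch, a Streamlit page along these lines could call the backend over HTTP; the `http://localhost:8000/query` URL and JSON shape match the hypothetical endpoint above and may differ from the actual `frontend/app.py`.

```python
# Illustrative Streamlit client; the real frontend/app.py may differ.
# Assumes the backend from the previous sketch is running on localhost:8000.
import requests
import streamlit as st

st.title("LlamaIndex + Azure OpenAI POC")

question = st.text_input("Ask a question")
if st.button("Submit") and question:
    # Forward the question to the backend and show whatever it returns.
    response = requests.post(
        "http://localhost:8000/query",
        json={"question": question},
        timeout=60,
    )
    response.raise_for_status()
    st.write(response.json().get("answer", "No answer returned."))
```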
For more information on how to use LlamaIndex with Azure OpenAI, please refer to the LlamaIndex documentation.
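To give a concrete sense of that integration, the sketch below wires LlamaIndex to Azure OpenAI for both the LLM and embeddings and queries a small local index. The deployment names, environment variables, API version, and `data/` directory are placeholders rather than this repository's configuration, and exact import paths can vary between `llama-index` versions.

```python
# Minimal LlamaIndex + Azure OpenAI sketch. Deployment names, env vars,
# API version, and the data/ path are placeholders, not repo settings.
import os

from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.embeddings.azure_openai import AzureOpenAIEmbedding
from llama_index.llms.azure_openai import AzureOpenAI

# Point LlamaIndex at your Azure OpenAI deployments.
Settings.llm = AzureOpenAI(
    engine="my-gpt-deployment",            # Azure deployment name (placeholder)
    model="gpt-4o",                        # underlying model name
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_version="2024-02-15-preview",
)
Settings.embed_model = AzureOpenAIEmbedding(
    model="text-embedding-3-small",
    deployment_name="my-embedding-deployment",  # placeholder
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_version="2024-02-15-preview",
)

# Index the documents in ./data and ask a question over them.
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
print(index.as_query_engine().query("What does this POC demonstrate?"))
```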
- Make sure to customize `main.py` as per your POC requirements.
- Update `requirements.txt` as necessary when adding or updating dependencies.