How to configure two Azure API keys/endpoints, one for embedding and one for GPT-4 #999
-
On the latest build (0.3.2) you should be able to set separate environment variables for the LLM and the embedding model (see https://github.com/cpacker/MemGPT/blob/main/memgpt/cli/cli_config.py#L33-L44):
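A minimal sketch of that split is below. The key/version names also appear later in this thread; the ENDPOINT and EMBEDDING_KEY/EMBEDDING_ENDPOINT names are assumptions and should be verified against the linked cli_config.py for your version:

```bash
# GPT-4 resource (key/version variable names are the ones quoted in this thread)
export AZURE_OPENAI_KEY="<key-for-gpt4-resource>"
export AZURE_OPENAI_VERSION="2023-07-01-preview"
export AZURE_OPENAI_ENDPOINT="https://<gpt4-resource>.openai.azure.com"   # name assumed; verify in cli_config.py

# Embedding resource (the *_EMBEDDING_* split is what the linked code adds;
# the exact KEY/ENDPOINT names here are assumptions, verify them in cli_config.py)
export AZURE_OPENAI_EMBEDDING_KEY="<key-for-embedding-resource>"
export AZURE_OPENAI_EMBEDDING_VERSION="2023-07-01-preview"
export AZURE_OPENAI_EMBEDDING_ENDPOINT="https://<embedding-resource>.openai.azure.com"
```

The point of the split is that the GPT-4 deployment and the embedding deployment can live on two different Azure resources, each with its own key and endpoint.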
LMK if this doesn't work for you!
-
Hi @cpacker, I updated the code locally and passed the embedding API key as suggested above. After that, I configured MemGPT to use the GPT4-32 model with a second API key passed in AZURE_OPENAI_KEY. However, with this version of pymemgpt (pymemgpt==0.3.2) I am getting the error below; I am not sure why, since I have set all the variables as you suggested. Can you please advise?

export AZURE_OPENAI_KEY="..."
export AZURE_OPENAI_VERSION="2023-07-01-preview"
export AZURE_OPENAI_EMBEDDING_VERSION="2023-07-01-preview"

Error:

File "/Users/rajendra.t/Library/CloudStorage/OneDrive-MaerskGroup/GITRepo/MemGPT/.venv/lib/python3.11/site-packages/memgpt/main.py", line 347, in run_agent_loop
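As a generic shell sanity check (not MemGPT-specific), the exports can be confirmed in the same shell that launches memgpt before digging further:

```bash
# List every AZURE_OPENAI* variable visible to the current shell;
# an empty result means the exports never reached this session.
env | grep '^AZURE_OPENAI'
```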
-
Hi team, I am facing issues configuring an Azure LLM with MemGPT. I have two different endpoints, one for the embedding model and one for GPT-4. When I configure GPT-4 I can converse with MemGPT but cannot use the embedding model, and when I configure the embedding model I can create embeddings but cannot converse. I am using environment variables for the configuration (roughly as sketched below). Can you please suggest?
https://memgpt.readme.io/docs/endpoints
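A single-resource sketch of that setup, assuming standard Azure OpenAI variable names (the ENDPOINT name in particular is an assumption; the key/version names appear elsewhere in this thread), looks like:

```bash
# One Azure resource configured at a time: whichever endpoint/key is set here
# serves either the GPT-4 deployment or the embedding deployment, not both.
export AZURE_OPENAI_KEY="<key-for-one-resource>"
export AZURE_OPENAI_VERSION="2023-07-01-preview"
export AZURE_OPENAI_ENDPOINT="https://<one-resource>.openai.azure.com"   # name assumed
```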