**Describe the bug**
Upon attempting to use any model other than gemini-pro, it throws an error.
**Please describe your setup**
How did you install memgpt? `pip install pymemgpt`
**Describe your setup**
What's your OS (Windows/MacOS/Linux)? WSL
How are you running memgpt? (cmd.exe/Powershell/Anaconda Shell/Terminal) Terminal
**Screenshots**
If applicable, add screenshots to help explain your problem.
```
Traceback (most recent call last):
  File "/home/ndisalvio/.local/lib/python3.10/site-packages/memgpt/main.py", line 415, in run_agent_loop
    new_messages, user_message, skip_next_user_input = process_agent_step(user_message, no_verify)
  File "/home/ndisalvio/.local/lib/python3.10/site-packages/memgpt/main.py", line 384, in process_agent_step
    new_messages, heartbeat_request, function_failed, token_warning, tokens_accumulated = memgpt_agent.step(
  File "/home/ndisalvio/.local/lib/python3.10/site-packages/memgpt/agent.py", line 787, in step
    raise e
  File "/home/ndisalvio/.local/lib/python3.10/site-packages/memgpt/agent.py", line 702, in step
    response = self._get_ai_reply(
  File "/home/ndisalvio/.local/lib/python3.10/site-packages/memgpt/agent.py", line 420, in _get_ai_reply
    raise e
  File "/home/ndisalvio/.local/lib/python3.10/site-packages/memgpt/agent.py", line 395, in _get_ai_reply
    response = create(
  File "/home/ndisalvio/.local/lib/python3.10/site-packages/memgpt/llm_api/llm_api_tools.py", line 133, in wrapper
    raise e
  File "/home/ndisalvio/.local/lib/python3.10/site-packages/memgpt/llm_api/llm_api_tools.py", line 106, in wrapper
    return func(*args, **kwargs)
  File "/home/ndisalvio/.local/lib/python3.10/site-packages/memgpt/llm_api/llm_api_tools.py", line 273, in create
    return google_ai_chat_completions_request(
  File "/home/ndisalvio/.local/lib/python3.10/site-packages/memgpt/llm_api/google_ai.py", line 426, in google_ai_chat_completions_request
    assert model in SUPPORTED_MODELS, f"Model '{model}' not in supported models: {', '.join(SUPPORTED_MODELS)}"
AssertionError: Model 'gemini-1.5-pro' not in supported models: gemini-pro
```
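For context, the failure comes from a hard-coded model allowlist in `google_ai.py`: the traceback shows `SUPPORTED_MODELS` contains only `gemini-pro`, so every other model name fails the assertion before any request is made. A minimal sketch of that kind of check (the `check_model` helper name is illustrative, not MemGPT's actual function; `SUPPORTED_MODELS` and the error text are taken from the traceback above):

```python
# Reproduction sketch of the allowlist check seen in the traceback.
# As of the reported version, only "gemini-pro" is in the list.
SUPPORTED_MODELS = ["gemini-pro"]

def check_model(model: str) -> None:
    # Same assertion style as google_ai.py line 426 in the traceback.
    assert model in SUPPORTED_MODELS, (
        f"Model '{model}' not in supported models: {', '.join(SUPPORTED_MODELS)}"
    )

check_model("gemini-pro")  # passes silently

try:
    check_model("gemini-1.5-pro")
except AssertionError as e:
    print(e)
```

Adding newer Gemini model names (e.g. `gemini-1.5-pro`) to that list, or replacing the assertion with a dynamic lookup of available models, would presumably avoid this error.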
**Additional context**
I have a paid plan.
**MemGPT Config**
Please attach your `~/.memgpt/config` file or copy-paste it below.
```
[defaults]
preset = memgpt_chat
persona = sam_pov
human = basic
[model]
model = gemini-1.5-pro
model_endpoint_type = google_ai
context_window = 2097152
[embedding]
embedding_endpoint_type = openai
embedding_endpoint = https://api.openai.com/v1
embedding_model = text-embedding-ada-002
embedding_dim = 1536
embedding_chunk_size = 300
[archival_storage]
type = chroma
path = /home/ndisalvio/.memgpt/chroma
[recall_storage]
type = sqlite
path = /home/ndisalvio/.memgpt
[metadata_storage]
type = sqlite
path = /home/ndisalvio/.memgpt
[version]
memgpt_version = 0.3.18
[client]
anon_clientid = 00000000-0000-0000-0000-000000000000
```
---