
Unable to use gemini models #1486

Open
ndisalvio3 opened this issue Jun 29, 2024 · 1 comment

Comments


ndisalvio3 commented Jun 29, 2024

Describe the bug
Attempting to use any model other than gemini-pro throws an error.

Please describe your setup

  • How did you install memgpt?
    • pip install pymemgpt
  • Describe your setup
    • What's your OS (Windows/MacOS/Linux)? WSL
    • How are you running memgpt? (cmd.exe/Powershell/Anaconda Shell/Terminal) Terminal

Screenshots
If applicable, add screenshots to help explain your problem.

```
Traceback (most recent call last):
  File "/home/ndisalvio/.local/lib/python3.10/site-packages/memgpt/main.py", line 415, in run_agent_loop
    new_messages, user_message, skip_next_user_input = process_agent_step(user_message, no_verify)
  File "/home/ndisalvio/.local/lib/python3.10/site-packages/memgpt/main.py", line 384, in process_agent_step
    new_messages, heartbeat_request, function_failed, token_warning, tokens_accumulated = memgpt_agent.step(
  File "/home/ndisalvio/.local/lib/python3.10/site-packages/memgpt/agent.py", line 787, in step
    raise e
  File "/home/ndisalvio/.local/lib/python3.10/site-packages/memgpt/agent.py", line 702, in step
    response = self._get_ai_reply(
  File "/home/ndisalvio/.local/lib/python3.10/site-packages/memgpt/agent.py", line 420, in _get_ai_reply
    raise e
  File "/home/ndisalvio/.local/lib/python3.10/site-packages/memgpt/agent.py", line 395, in _get_ai_reply
    response = create(
  File "/home/ndisalvio/.local/lib/python3.10/site-packages/memgpt/llm_api/llm_api_tools.py", line 133, in wrapper
    raise e
  File "/home/ndisalvio/.local/lib/python3.10/site-packages/memgpt/llm_api/llm_api_tools.py", line 106, in wrapper
    return func(*args, **kwargs)
  File "/home/ndisalvio/.local/lib/python3.10/site-packages/memgpt/llm_api/llm_api_tools.py", line 273, in create
    return google_ai_chat_completions_request(
  File "/home/ndisalvio/.local/lib/python3.10/site-packages/memgpt/llm_api/google_ai.py", line 426, in google_ai_chat_completions_request
    assert model in SUPPORTED_MODELS, f"Model '{model}' not in supported models: {', '.join(SUPPORTED_MODELS)}"
AssertionError: Model 'gemini-1.5-pro' not in supported models: gemini-pro
```
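The failing line in the traceback is a plain allow-list assertion. A minimal sketch of that pattern is below; `SUPPORTED_MODELS` here is illustrative (seeded only with the value the error message reports), and `check_model` is a hypothetical name, not MemGPT's actual function.

```python
# Illustrative allow-list guard, matching the assertion seen in the traceback.
# SUPPORTED_MODELS and check_model are assumptions, not MemGPT's real code.
SUPPORTED_MODELS = ["gemini-pro"]

def check_model(model: str) -> None:
    """Raise AssertionError when the model is not in the allow-list."""
    assert model in SUPPORTED_MODELS, (
        f"Model '{model}' not in supported models: {', '.join(SUPPORTED_MODELS)}"
    )

check_model("gemini-pro")  # passes silently
try:
    check_model("gemini-1.5-pro")
except AssertionError as e:
    print(e)  # Model 'gemini-1.5-pro' not in supported models: gemini-pro
```

Because the guard is a hard-coded list rather than a live query of the Google AI API, any newer Gemini model fails this check until the list is updated in a release.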

Additional context
I have a paid plan.

MemGPT Config
Please attach your `~/.memgpt/config` file or copy-paste it below.

```
[defaults]
preset = memgpt_chat
persona = sam_pov
human = basic

[model]
model = gemini-1.5-pro
model_endpoint_type = google_ai
context_window = 2097152

[embedding]
embedding_endpoint_type = openai
embedding_endpoint = https://api.openai.com/v1
embedding_model = text-embedding-ada-002
embedding_dim = 1536
embedding_chunk_size = 300

[archival_storage]
type = chroma
path = /home/ndisalvio/.memgpt/chroma

[recall_storage]
type = sqlite
path = /home/ndisalvio/.memgpt

[metadata_storage]
type = sqlite
path = /home/ndisalvio/.memgpt

[version]
memgpt_version = 0.3.18

[client]
anon_clientid = 00000000-0000-0000-0000-000000000000
```

---
@MisileLab

gemini-1.5-pro isn't supported yet.

Labels: None yet
Projects: Status: To triage
Development: No branches or pull requests
2 participants