Closed as not planned
What happened?
Getting a stack trace during app start.
Config:

```yaml
model_list:
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: gpt-3.5-turbo
      api_key: sk-E0BrSPzzi06mfRqy-------9bO
      rpm: 200
  - model_name: gpt-3.5-turbo-16k
    litellm_params:
      model: gpt-3.5-turbo-16k
      api_key: sk-E0BrSPzzi06mfRqy-------9bO
      rpm: 100

general_settings:
  litellm_master_key: sk-aztest
  ui_username: litellm
  ui_password: litellm

litellm_settings:
  set_verbose: True
  cache: True
  cache_params:
    type: "redis-semantic"
    host: "localhost"
    port: "6379"
    password: "password"
    similarity_threshold: 0.8
    redis_semantic_cache_use_async: True
    redis_semantic_cache_embedding_model: "text-embedding-ada-002"
```
Relevant log output

```shell
❯ litellm --config config.yaml
INFO:     Started server process [42803]
INFO:     Waiting for application startup.

#------------------------------------------------------------#
#                                                             #
#           'It would help me if you could add...'            #
#        https://github.com/BerriAI/litellm/issues/new        #
#                                                             #
#------------------------------------------------------------#

 Thank you for using LiteLLM! - Krrish & Ishaan

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new

Setting Cache on Proxy
redis semantic-cache initializing INDEX - litellm_semantic_cache_index
redis semantic-cache redis_url: redis://:password@localhost:6379
ERROR:    Traceback (most recent call last):
  File "/Users/mda/.pyenv/versions/3.12.2/lib/python3.12/site-packages/starlette/routing.py", line 734, in lifespan
    async with self.lifespan_context(app) as maybe_state:
  File "/Users/mda/.pyenv/versions/3.12.2/lib/python3.12/site-packages/starlette/routing.py", line 610, in __aenter__
    await self._router.startup()
  File "/Users/mda/.pyenv/versions/3.12.2/lib/python3.12/site-packages/starlette/routing.py", line 711, in startup
    await handler()
  File "/Users/mda/.pyenv/versions/3.12.2/lib/python3.12/site-packages/litellm/proxy/proxy_server.py", line 3192, in startup_event
    await initialize(**worker_config)
  File "/Users/mda/.pyenv/versions/3.12.2/lib/python3.12/site-packages/litellm/proxy/proxy_server.py", line 3036, in initialize
    ) = await proxy_config.load_config(router=llm_router, config_file_path=config)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/mda/.pyenv/versions/3.12.2/lib/python3.12/site-packages/litellm/proxy/proxy_server.py", line 2060, in load_config
    self._init_cache(cache_params=cache_params)
  File "/Users/mda/.pyenv/versions/3.12.2/lib/python3.12/site-packages/litellm/proxy/proxy_server.py", line 1961, in _init_cache
    litellm.cache = Cache(**cache_params)
                    ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/mda/.pyenv/versions/3.12.2/lib/python3.12/site-packages/litellm/caching.py", line 1490, in __init__
    self.cache = RedisSemanticCache(
                 ^^^^^^^^^^^^^^^^^^^
  File "/Users/mda/.pyenv/versions/3.12.2/lib/python3.12/site-packages/litellm/caching.py", line 776, in __init__
    self.index = SearchIndex.from_dict(schema)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/mda/.pyenv/versions/3.12.2/lib/python3.12/site-packages/redisvl/index/index.py", line 266, in from_dict
    schema = IndexSchema.from_dict(schema_dict)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/mda/.pyenv/versions/3.12.2/lib/python3.12/site-packages/redisvl/schema/schema.py", line 280, in from_dict
    return cls(**data)
           ^^^^^^^^^^^
  File "/Users/mda/.pyenv/versions/3.12.2/lib/python3.12/site-packages/pydantic/v1/main.py", line 341, in __init__
    raise validation_error
pydantic.v1.error_wrappers.ValidationError: 1 validation error for IndexSchema
__root__
  New schema format introduced; please update schema spec. (type=value_error)
ERROR:    Application startup failed. Exiting.
```

Twitter / LinkedIn details
No response
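For anyone triaging: the `ValidationError` is raised by redisvl's `IndexSchema` rejecting the schema dict that litellm's `RedisSemanticCache` builds, i.e. a schema written in the pre-0.1.0 format being fed to a newer redisvl. The field layouts below are a sketch of the two formats as an assumption based on the redisvl schema change, not copied from the litellm source:

```python
# Sketch of the schema-format mismatch behind the ValidationError.
# Field names/dims here are illustrative assumptions, not litellm's actual schema.

# Old-style redisvl schema (pre-0.1.0): fields grouped into lists by type.
old_schema = {
    "index": {"name": "litellm_semantic_cache_index", "prefix": "litellm"},
    "fields": {
        "text": [{"name": "response"}],
        "vector": [{
            "name": "litellm_embedding",
            "dims": 1536,
            "distance_metric": "cosine",
            "algorithm": "flat",
        }],
    },
}

# New-style schema (redisvl >= 0.1.0): a flat list of field objects,
# each carrying its own "type" key.
new_schema = {
    "index": {"name": "litellm_semantic_cache_index", "prefix": "litellm"},
    "fields": [
        {"name": "response", "type": "text"},
        {"name": "litellm_embedding", "type": "vector",
         "attrs": {"dims": 1536, "distance_metric": "cosine",
                   "algorithm": "flat"}},
    ],
}

# Newer redisvl raises "New schema format introduced; please update schema
# spec." when it receives the old grouped layout, which matches the
# traceback above. Distinguishing mark: "fields" is a dict in the old
# format and a list in the new one.
print(isinstance(old_schema["fields"], dict))
print(isinstance(new_schema["fields"], list))
```

If this diagnosis holds, a workaround until the schema is updated in litellm would be pinning redisvl to the last release that accepts the old format (exact version not verified here).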