Issues: BerriAI/litellm
[Feature]: aiohttp migration - 10-100x Higher RPS Master ti...
#7544
opened Jan 4, 2025 by
ishaan-jaff
Issues list
[Bug]: Changing Organization name in UI fails
bug
Something isn't working
#8677
opened Feb 20, 2025 by
bopowers9
[Bug]: Duplicated Content in text and content Fields Using Together AI on Streaming Response
bug
Something isn't working
#8675
opened Feb 20, 2025 by
josperrod9
[Bug]: Empty 'contents' field in Gemini 1.5 Flash requests with CrewAI integration
bug
Something isn't working
#8673
opened Feb 20, 2025 by
nickprock
Increased around 40ms ASR Latency at P50 After Integrating with LiteLLM
#8671
opened Feb 20, 2025 by
yin250
[Feature]: Add JWT Claims to CustomOpenID object
enhancement
New feature or request
#8663
opened Feb 20, 2025 by
ncecere
[Bug]: JWT access with Groups not working when team is assigned All Proxy Models access
bug
Something isn't working
mlops user request
#8662
opened Feb 19, 2025 by
ma-armenta
[Feature]: Allow custom embeddings models via CustomLLM
enhancement
New feature or request
#8660
opened Feb 19, 2025 by
mattmalcher
[Feature]: Add Support for Requesty.ai in LiteLLM
enhancement
New feature or request
#8659
opened Feb 19, 2025 by
ViezeVingertjes
[Feature]: Support for 'Retrieve run' and 'Submit tool outputs to run' which are part of OpenAI's Assistant APIs Runs Feature
enhancement
New feature or request
mlops user request
#8658
opened Feb 19, 2025 by
Nikhil101
[Bug]: No module named tzdata
bug
Something isn't working
mlops user request
#8657
opened Feb 19, 2025 by
bharath-muthineni
[Bug]: output_cost_per_image for vllm
bug
Something isn't working
#8656
opened Feb 19, 2025 by
Mte90
[Bug]: Reasoning Content as default is not expected.
bug
Something isn't working
#8653
opened Feb 19, 2025 by
tonysy
[Bug]: Stream Timeout doesn't work for Bedrock models
bedrock
bug
Something isn't working
feb 2025
#8652
opened Feb 19, 2025 by
jonas-lyrebird-health
[Bug]: Helm chart ignores PROXY_MASTER_KEY environment variable and always generates its own
bug
Something isn't working
#8650
opened Feb 19, 2025 by
hnykda
[Bug]: Migration job in the helm chart has inconsistent DB specification
bug
Something isn't working
#8649
opened Feb 19, 2025 by
hnykda
[Feature]: Improving Retry Mechanism Consistency and Logging for Streamed Responses in LiteLLM Proxy
enhancement
New feature or request
#8648
opened Feb 19, 2025 by
fengjiajie
[Feature]: more conventional + configurable python logging
enhancement
New feature or request
mlops user request
#8641
opened Feb 19, 2025 by
NorthIsUp
[Bug]: [Nit] max_completion_tokens not supported on Azure when api_version is not specified
bug
Something isn't working
#8638
opened Feb 19, 2025 by
enyst
[Feature]: support adding litellm_metadata to gemini passthrough
enhancement
New feature or request
#8634
opened Feb 18, 2025 by
trashhalo
[Bug]: gemini fallback not working
bug
Something isn't working
#8632
opened Feb 18, 2025 by
clarity99
[Bug]: Reasoning with OpenRouter is not available while streaming the completion
bug
Something isn't working
#8631
opened Feb 18, 2025 by
maykcaldas
[Bug]: async_post_call_streaming_hook
bug
Something isn't working
#8628
opened Feb 18, 2025 by
tony-tvu
LiteLLM.Info: If you need to debug this error, use `litellm._turn_on_debug()'.
#8621
opened Feb 18, 2025 by
VamshikrishnaAluwala
[Bug]: Memory Leak in completion() with stream=True
bug
Something isn't working
feb 2025
#8620
opened Feb 18, 2025 by
iwamot