Releases: Portkey-AI/portkey-python-sdk
v1.1.0: Move pydantic dependency from a pinned (==) to a greater-than (>) version constraint
What's Changed
- Move pydantic from equals to gt version number by @ayush-portkey in #63
Full Changelog: v1.0.1...v1.1.0
v1.0.1: Rename prompt to prompts
What's Changed
- Rename prompt to prompts by @ayush-portkey in #62 (see the sketch below)
New Contributors
- @ayush-portkey made their first contribution in #62
Full Changelog: v1.0.0...v1.0.1
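If you are picking up this release, the prompt surface is now plural. Here is a minimal sketch of the renamed attribute, assuming a saved prompt template and the v1.0 client described in the notes below; the prompt ID and variables are placeholders:

```python
from portkey_ai import Portkey

client = Portkey(api_key="PORTKEY_API_KEY")  # placeholder: your Portkey API key

# Formerly accessed via the singular `prompt` attribute; from v1.0.1 it is `client.prompts`.
response = client.prompts.completions.create(
    prompt_id="YOUR_PROMPT_ID",            # placeholder: ID of a saved prompt template
    variables={"topic": "observability"},  # placeholder variables for the template
)
print(response)
```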
v1.0.0: Major SDK updates
Summary
We're pushing out some exciting new updates to Portkey's SDKs, APIs, and Configs.
Portkey's SDKs move up to major version 1.0, bringing parity with the new OpenAI SDK structure and adding Portkey's production features on top. We are also bringing native Langchain & LlamaIndex integrations into the SDK.
NOTE: This is a breaking change that requires migration.
Here's What's New:
- More extensible SDK that can be used with many more LLM providers
- Out-of-the-box support for streaming
- Completely follows OpenAI's SDK signature, reducing your technical debt (see the sketch after this list)
- Native support for Langchain & LlamaIndex within the SDK
- Support for the Portkey Feedback endpoint
- Support for Portkey Prompt Templates
- Older SDK versions to be deprecated soon
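As a quick illustration of the OpenAI-style signature and built-in streaming, here is a minimal sketch; the API key, virtual key, and model name are placeholders, and the exact parameters will depend on your provider:

```python
from portkey_ai import Portkey

client = Portkey(
    api_key="PORTKEY_API_KEY",           # placeholder: your Portkey API key
    virtual_key="PROVIDER_VIRTUAL_KEY",  # placeholder: virtual key for your LLM provider
)

# Chat completion, mirroring openai-python's client.chat.completions.create
completion = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Say hello"}],
)
print(completion.choices[0].message.content)

# Streaming works out of the box with stream=True
stream = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Count to three"}],
    stream=True,
)
for chunk in stream:
    print(chunk.choices[0].delta.content or "", end="")
```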
Configs 2.0
Here's What's New:
- A new concept of strategy replaces the standalone mode. You can now build bespoke gateway strategies and nest them in a single Config.
- You can also trigger a specific strategy on specific error codes.
- A new concept of targets replaces options from the previous Config
- If you add a virtual_key to the targets array, you no longer need to add provider; Portkey will pick up the provider directly from the virtual key!
- For Azure, now pass only the virtual_key; it takes care of all the other Azure params like deployment name, API version, etc. (a Config sketch follows below)
Check out the complete portkeys-december-migration guide for the full list of changes and usage details.
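For illustration, here is a hedged sketch of a Configs 2.0 object: a fallback strategy triggered on specific status codes, with two virtual-key targets. The virtual keys and status codes are placeholders; refer to the migration guide above for the full schema.

```python
from portkey_ai import Portkey

# Sketch of a Configs 2.0 object: a nested strategy plus a targets array.
config = {
    "strategy": {
        "mode": "fallback",             # gateway strategy instead of the old standalone mode
        "on_status_codes": [429, 503],  # trigger the fallback only on these error codes
    },
    "targets": [
        {"virtual_key": "openai-virtual-key"},  # provider is inferred from the virtual key
        {"virtual_key": "azure-virtual-key"},   # Azure deployment, API version, etc. come from the virtual key
    ],
}

# Attach the config to the client (the API key is a placeholder).
client = Portkey(api_key="PORTKEY_API_KEY", config=config)
```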
v0.1.53
What's Changed
- Add load balance example by @vrushankportkey in #28
- feat: Added support for config-slug in the apis by @noble-varghese in #35
- feat: Version upgrade - 0.1.53 by @noble-varghese in #36
Full Changelog: v0.1.52...v0.1.53
v0.1.52
What's Changed
- feat: Adding Integration Testing Suite for LLM Integrations by @noble-varghese in #23
- Pass prompt at completion call by @vrushankportkey in #24
- feat: anyscale integration & openschema for all integrations by @noble-varghese in #25
- feat: Version upgrade - 0.1.52 by @noble-varghese in #26
Full Changelog: v0.1.51...v0.1.52
v0.1.51
What's Changed
- fix: Issue with the str comparison on the api path by @noble-varghese in #22
Full Changelog: v0.1.50...v0.1.51
v0.1.50
What's Changed
- Community Content Added by @vrushankportkey in #14
- Add README by @vrushankportkey in #17
- Update Readme by @vrushankportkey in #18
- feat: introducing a "Generate" API for prompt-based content generation with saved prompts by @noble-varghese in #19
- feat: accepting variables in the generations API by @noble-varghese in #20
- feat: version upgrade to 0.1.50 by @noble-varghese in #21
Full Changelog: v0.1.49...v0.1.50
v0.1.49 (Stable)
What's Changed
- fix: Fixing the issue with the capturing of env variables by @noble-varghese in #13
Full Changelog: v0.1.48...v0.1.49
v0.1.48
What's Changed
- fix: Fixed issue with enum check by @noble-varghese in #10
- fix: cache age type fix by @noble-varghese in #11
Full Changelog: v0.1.45...v0.1.48
v0.1.45
What's Changed
- docs: Added examples for azure-openai fallback and loadbalance by @noble-varghese in #8
- feat: Add support for azure models by @noble-varghese in #9
Full Changelog: v0.1.44...v0.1.45