feat: use tutor configs to select model, key and options in litellm #19
Conversation
Thanks for the pull request, @Henrrypg! Once you've gone through the following steps, feel free to tag the repository's maintainers in a comment and let them know that your changes are ready for engineering review.

🔘 Get product approval
If you haven't already, check this list to see if your contribution needs to go through the product review process.

🔘 Provide context
To help your reviewers and other members of the community understand the purpose and larger context of your changes, feel free to add as much of the following information to the PR description as you can.

🔘 Get a green build
If one or more checks are failing, continue working on your changes until this is no longer the case and your build turns green.

Where can I find more information?
If you'd like to get more details on all aspects of the review process for open source pull requests (OSPRs), check out the following resources.

When can I expect my changes to be merged?
Our goal is to get community contributions seen and reviewed as efficiently as possible. However, the amount of time that it takes to review and merge a PR can vary significantly based on factors such as:

💡 As a result it may take up to several weeks or months to complete a review and merge your PR.
I changed the base so that this PR only reflects the changes meant to use Tutor configs as a way of passing the secret key for OpenAI or other LLM providers.
```python
# Each new setting is a pair: (setting_name, default_value).
("OPENEDX_AI_EXTENSIONS", [{
    "default": {
        "API_KEY": "",
```
We can write the default as `put_your_api_key_here`.
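A readable placeholder default, as suggested, would also make it easy to detect an unconfigured key. The snippet below is a sketch of that idea; the constant and helper names are illustrative, not the plugin's actual code:

```python
# Hypothetical default shape with a human-readable placeholder instead of
# an empty string, so operators immediately see what must be replaced.
OPENEDX_AI_EXTENSIONS_DEFAULT = {
    "default": {
        "API_KEY": "put_your_api_key_here",
    },
}

def api_key_is_configured(config: dict) -> bool:
    """Return True only when the operator replaced the placeholder."""
    key = config.get("default", {}).get("API_KEY", "")
    return bool(key) and key != "put_your_api_key_here"
```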
```python
settings.AI_MODEL = 'gpt-4.1-mini'
settings.OPENAI_API_KEY = "make_it_read_from_tutor"
if not hasattr(settings, "OPENEDX_AI_EXTENSIONS"):
    settings.OPENEDX_AI_EXTENSIONS = os.getenv("OPENEDX_AI_EXTENSIONS", default={
```
You can use a deep update of the dict here so that incomplete dicts update the default.
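The deep-update idea suggested above can be sketched as follows; this is a minimal recursive merge, not the code that was eventually committed, and the default values shown are illustrative:

```python
def deep_update(base: dict, overrides: dict) -> dict:
    """Recursively merge overrides into base so an incomplete dict
    only replaces the keys it actually provides."""
    merged = dict(base)
    for key, value in overrides.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = deep_update(merged[key], value)
        else:
            merged[key] = value
    return merged

# Hypothetical defaults, mirroring the shape quoted in the diff above.
DEFAULTS = {
    "default": {
        "API_KEY": "",
        "LITELLM_MODEL": "gpt-4.1-mini",
        "MAX_TOKENS": 8400,
    },
}

# An incomplete override supplies only the API key...
user_config = {"default": {"API_KEY": "sk-test"}}
settings_value = deep_update(DEFAULTS, user_config)
# ...and the remaining defaults survive the merge.
```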
By deleting the branch that was targeted here I inadvertently deleted the PR. I'm opening it again and moving it to track main, since the tutor PR was merged.
Force-pushed from 8e6925d to ecdda37.
Codecov Report
✅ All modified and coverable lines are covered by tests.

Additional details and impacted files:

```
@@            Coverage Diff            @@
##             main      #19       +/-   ##
===========================================
+ Coverage   46.62%   67.90%   +21.28%
===========================================
  Files          17       16        -1
  Lines         311      349       +38
  Branches       20       23        +3
===========================================
+ Hits          145      237       +92
+ Misses        166      108       -58
- Partials        0        4        +4
```

☔ View full report in Codecov by Sentry.
@Henrrypg I rebased and updated this PR according to our call. Now the tutor config accepts either one variable with the key or the complete dict for AI_EXTENSIONS. You can use any of these examples. If you use them all at once, AI_EXTENSIONS will override AI_EXTENSIONS_OPENAI_API_KEY and AI_EXTENSIONS_ANTHROPIC_API_KEY.

```yaml
# simple version
AI_EXTENSIONS_OPENAI_API_KEY: sk-...
AI_EXTENSIONS_ANTHROPIC_API_KEY: sk-...

# advanced
AI_EXTENSIONS:
  anthropic:
    API_KEY: sk-....
    LITELLM_MODEL: claude-3-haiku-20240307
    MAX_TOKENS: 8400
    TEMPERATURE: 1
    TIMEOUT: 600
```
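The precedence described in that comment (the full AI_EXTENSIONS dict overriding the per-provider key variables) could be resolved roughly as below. This is a guess at the mechanism; the function name and config shape are illustrative, not the plugin's actual code:

```python
def resolve_ai_extensions(tutor_config: dict) -> dict:
    """Build the per-provider config: start from the simple key-only
    variables, then let a full AI_EXTENSIONS dict override them."""
    resolved: dict = {}

    # Simple version: one variable per provider, API key only.
    simple_keys = {
        "openai": tutor_config.get("AI_EXTENSIONS_OPENAI_API_KEY"),
        "anthropic": tutor_config.get("AI_EXTENSIONS_ANTHROPIC_API_KEY"),
    }
    for provider, key in simple_keys.items():
        if key:
            resolved[provider] = {"API_KEY": key}

    # Advanced version wins: the complete dict overrides the simple keys.
    for provider, options in tutor_config.get("AI_EXTENSIONS", {}).items():
        resolved[provider] = {**resolved.get(provider, {}), **options}

    return resolved
```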
Force-pushed from e4dafbb to 98566c7.
Adding simple configuration with key only for openai and anthropic, plus override capability.
Force-pushed from 98566c7 to a00ac4d.
Force-pushed from 863263b to f9eed65.
Force-pushed from f9eed65 to d82be77.
@felipemontoya It's working fine for me; I just added some tests to pass the checks.
This has been tested by me and Henrry, and we also used this code to deploy to the sandbox at:
This pull request introduces several important updates to both the backend and frontend of the `openedx_ai_extensions` project, focusing on improved configuration management, plugin integration with Tutor, and enhancements to build and packaging processes. The most significant changes include the migration to a centralized configuration object for AI model settings, the addition of Tutor plugin hooks and patch management, and updates to frontend package naming and build scripts.

Configuration and Backend Updates:
- Introduced the `OPENEDX_AI_EXTENSIONS` configuration object, which is now read from environment variables and used throughout the backend, replacing individual settings like `AI_MODEL` and `OPENAI_API_KEY`. [1] [2] [3]
- Switched to `pathlib.Path` for more robust file handling when reading the `README.md` and `CHANGELOG.rst` files. [1] [2]

Tutor Plugin Integration:
- Added `plugin.py`, including default configuration values, build context management, patch file integration, and UI slot configuration for the learning experience. [1] [2]

Frontend and Build Process Improvements:
- Renamed the frontend package to `@openedx/openedx-ai-extensions-ui` for consistency with backend naming conventions.
- Updated the `Makefile` to handle the absence of the `LICENSE` file gracefully during the build process.

Versioning:
- Bumped the version to `20.0.0` in `__about__.py`.

These changes collectively improve the maintainability, deployment, and extensibility of the `openedx_ai_extensions` plugin within the Open edX ecosystem.