Allow model selection on the frontend #187
base: main
Conversation
Yeah, I like the way it looks. How about both on the settings page and in the chat?
After our discussion, we agreed to delay this for a while. Other models seem to be performing much worse in some ways (even though they are more powerful). So let's wait until we have a regression test suite; then we can enable some of those models after testing them thoroughly.
I think once we tackle this again, it makes sense to refactor it in a generic way so as to support different LLMs, and potentially local LLMs. The code for this is pretty disorganized and very specific to OpenAI right now (settings and validation, selecting the LLM in code, etc.).
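For reference, a rough sketch of what such a provider-agnostic abstraction could look like. This is illustrative only (the provider ids, method names, and registry are assumptions, not DataLine's actual code); the point is that settings, validation, and model selection would all go through one interface instead of OpenAI-specific paths:

```ts
// Sketch of a provider-agnostic LLM abstraction (illustrative names only).
interface LLMProvider {
  id: string;                                  // e.g. "openai", "local"
  listModels(): Promise<string[]>;             // models this provider exposes
  validateSettings(settings: Record<string, string>): Promise<boolean>;
  complete(model: string, prompt: string): Promise<string>;
}

// A registry keyed by provider id keeps the rest of the app generic.
const providers = new Map<string, LLMProvider>();

function registerProvider(provider: LLMProvider): void {
  providers.set(provider.id, provider);
}

async function completeWith(
  providerId: string,
  model: string,
  prompt: string
): Promise<string> {
  const provider = providers.get(providerId);
  if (!provider) throw new Error(`Unknown provider: ${providerId}`);
  return provider.complete(model, prompt);
}
```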
Waiting for your feedback / an update on the above.
Yeah, this is postponed for a bit until we finish the evaluation pipeline. Once we have that, we'll be able to evaluate models quickly and decide whether or not we want to enable them in DataLine. Our goal isn't just to "support" lots of models; it's to allow users to pick among high-quality options that all work. There is a minimum bar we want to maintain here, otherwise someone might try DataLine with GPT-4o and decide that DataLine is bad (when the model itself is the problem). It's in our collective best interest to prevent that from happening 🙂
Issue #, if available:
#171
Description of changes:
Add a listbox on the settings page to allow choosing the preferred model.
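A minimal sketch of what the settings-page control could look like, using plain React/TypeScript with a native `<select>`. The model ids, component names, and the surrounding settings wiring are illustrative assumptions, not DataLine's actual code:

```tsx
import { useState } from "react";

// Illustrative option list; the real set of enabled models would come
// from the backend/settings API.
const AVAILABLE_MODELS = ["gpt-3.5-turbo", "gpt-4"] as const;
type ModelId = (typeof AVAILABLE_MODELS)[number];

interface ModelSelectProps {
  value: ModelId;
  onChange: (model: ModelId) => void;
}

// Controlled listbox for picking the preferred model.
export function ModelSelect({ value, onChange }: ModelSelectProps) {
  return (
    <label>
      Preferred model
      <select
        value={value}
        onChange={(e) => onChange(e.target.value as ModelId)}
      >
        {AVAILABLE_MODELS.map((model) => (
          <option key={model} value={model}>
            {model}
          </option>
        ))}
      </select>
    </label>
  );
}

// Hypothetical usage inside a settings page: keep the selection in state
// and persist it to the backend on change.
export function SettingsPage() {
  const [model, setModel] = useState<ModelId>("gpt-3.5-turbo");
  return <ModelSelect value={model} onChange={setModel} />;
}
```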
By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.