[14/n] Fixes for model name #15
Merged
Conversation
Ports over parameterization handling for AIConfig. There is still some work left around retrieving outputs properly, as well as tests, which will come in upcoming diffs.
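Parameterization here means substituting named values into prompt templates. A minimal sketch of what `{{param}}` substitution could look like; the function and type names are assumptions for illustration, not the actual AIConfig API:

```typescript
// Hypothetical sketch of {{param}} substitution; names are assumptions,
// not the actual AIConfig API.
type Params = Record<string, string>;

function resolveParameters(template: string, params: Params): string {
  // Replace each {{name}} occurrence with its value; unknown
  // parameters are left untouched.
  return template.replace(/\{\{\s*(\w+)\s*\}\}/g, (match, name) =>
    name in params ? params[name] : match
  );
}
```

Leaving unknown parameters untouched (rather than erasing them) makes missing values easy to spot in the rendered prompt.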
Next, I will add stubs for ModelParser. Then we can begin implementing them.
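As a rough sketch of what such a stub might look like, an abstract base class with serialize/deserialize/run responsibilities; the method names and signatures below are assumptions, not the actual API:

```typescript
// Hypothetical stub of a ModelParser base class; method names and
// signatures are assumptions, not the actual API.
abstract class ModelParser {
  // Convert model-specific completion params into a serializable prompt.
  abstract serialize(promptName: string, data: unknown): unknown;

  // Reconstruct completion params from a stored prompt.
  abstract deserialize(prompt: unknown): unknown;

  // Execute the prompt against the model and return its output.
  abstract run(prompt: unknown, params: Record<string, string>): Promise<unknown>;
}
```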
…lass Still getting library structure set up in TS
Used the executeCellWithDependencies implementation, which should be consistent with this.
Handles both chat and non-chat models (with two separate model parsers). Handles parameterization properly across the system prompt and previous messages. Next will add run methods for OpenAI that handle streaming and non-streaming responses.
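The chat case is the interesting one: parameters must be resolved in the system prompt and in every previous message, not just the latest user turn. A small illustrative sketch, with all names assumed:

```typescript
// Illustrative sketch only; type and function names are assumptions.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function substitute(text: string, params: Record<string, string>): string {
  return text.replace(/\{\{\s*(\w+)\s*\}\}/g, (m, k) => params[k] ?? m);
}

// A chat-model parser resolves parameters across the whole message
// history, including the system prompt.
function parameterizeChat(
  messages: ChatMessage[],
  params: Record<string, string>
): ChatMessage[] {
  return messages.map((m) => ({ ...m, content: substitute(m.content, params) }));
}
```

A non-chat (completion) parser only has a single prompt string to substitute into, so it can skip the per-message mapping.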
Some minor API design changes to help with returning streaming results back. Also implemented the run function for OpenAI models, handling both streaming and non-streaming scenarios. Added a callback handler to support sending incremental updates back to the caller.
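The callback pattern for incremental updates can be sketched as follows. The real OpenAI client call is omitted; any `AsyncIterable` of text deltas stands in for the stream, and all names here are assumptions:

```typescript
// Sketch of the streaming/non-streaming split with an incremental-update
// callback. Any AsyncIterable of text deltas stands in for the OpenAI
// stream; names are assumptions.
type StreamCallback = (delta: string, accumulated: string) => void;

async function runWithStreaming(
  deltas: AsyncIterable<string>,
  onStream?: StreamCallback
): Promise<string> {
  let accumulated = "";
  for await (const delta of deltas) {
    accumulated += delta;
    // Push each incremental chunk back to the caller as it arrives.
    onStream?.(delta, accumulated);
  }
  // Non-streaming callers simply await the final accumulated text.
  return accumulated;
}
```

With this shape, the same run method serves both scenarios: streaming callers pass a callback, non-streaming callers just await the returned promise.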
Add implementation for all CRUD operations. Next up are tests and some additional APIs for parameterization.
Still WIP
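For a sense of the shape of those CRUD operations, here is a minimal in-memory sketch; the `Prompt` shape and method names are assumptions, not the actual AIConfig API:

```typescript
// Illustrative CRUD sketch over an in-memory prompt list; names are
// assumptions, not the actual AIConfig API.
interface Prompt {
  name: string;
  input: string;
}

class PromptStore {
  private prompts: Prompt[] = [];

  addPrompt(prompt: Prompt): void {
    if (this.prompts.some((p) => p.name === prompt.name)) {
      throw new Error(`Prompt "${prompt.name}" already exists`);
    }
    this.prompts.push(prompt);
  }

  getPrompt(name: string): Prompt | undefined {
    return this.prompts.find((p) => p.name === name);
  }

  updatePrompt(name: string, input: string): void {
    const prompt = this.getPrompt(name);
    if (!prompt) throw new Error(`Prompt "${name}" not found`);
    prompt.input = input;
  }

  deletePrompt(name: string): void {
    this.prompts = this.prompts.filter((p) => p.name !== name);
  }
}
```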
This allows each model parser to determine how to parse an Output object as a string value.
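A sketch of that per-parser responsibility; the `Output` shape here is a stand-in (the real AIConfig Output type is richer), and all names are assumptions:

```typescript
// Hypothetical shapes; the real AIConfig Output type is richer than this.
interface Output {
  data: unknown;
}

interface OutputStringifier {
  // Each model parser decides how its model's output maps to plain text.
  getOutputText(output: Output): string;
}

// A text-completion parser might just return the string payload, falling
// back to JSON for structured data.
const textOutputParser: OutputStringifier = {
  getOutputText: (o) =>
    typeof o.data === "string" ? o.data : JSON.stringify(o.data),
};
```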
* Add some CRUD operations to be able to interact with AIConfig Output objects more easily
* Also some minor refactorings to consolidate output handling in the AIConfigRuntime class.
When serializing completion params into a Prompt object, we need to use the model name specified in the completion params since that affects the model behavior. That should be the ID.
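The fix can be sketched as pulling the model name out of the completion params and using it as the prompt's model ID, with the remaining params kept as settings. All type and field names below are assumptions for illustration:

```typescript
// Sketch of using the model name from completion params (which
// determines model behavior) as the prompt's model ID. Names assumed.
interface CompletionParams {
  model: string;
  [setting: string]: unknown;
}

interface PromptModelMetadata {
  name: string; // the ID, taken from the completion params
  settings: Record<string, unknown>;
}

function serializeModelMetadata(params: CompletionParams): PromptModelMetadata {
  // Split the model name out; everything else becomes settings.
  const { model, ...settings } = params;
  return { name: model, settings };
}
```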
rholinshead
approved these changes
Oct 13, 2023
Stack created with Sapling. Best reviewed with ReviewStack.