
[13/n] Add output CRUD operations to AIConfig #14

Merged: 11 commits into main on Oct 13, 2023
Conversation

Ports over parameterization handling for AIConfig. There is still some work left around getting outputs properly, as well as tests, which will come in upcoming diffs.
Next will add stubs for ModelParser; then we can begin implementing them.

Still getting the library structure set up in TS.
Used the executeCellWithDependencies implementation, so this should be consistent with it.
Handles both chat and non-chat models (two separate model parsers for those).

Handles parameterization properly across the system prompt and previous messages.

Next will add run methods for OpenAI that handle streaming and non-streaming.
Some minor API design changes to help with returning streaming results back to the caller.
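A later update below mentions a callback handler for sending incremental updates back to the caller. As a rough illustration only, here is a minimal sketch of what such a callback shape could look like; the type and field names are assumptions, not the API added in this PR.

// Hypothetical sketch of a streaming callback shape; names are assumptions,
// not the actual types introduced by this PR.
export type StreamCallback = (
  delta: string,       // latest chunk of model output
  accumulated: string, // all output received so far
  index: number        // which choice/output the chunk belongs to
) => void;

export interface InferenceCallbacks {
  streamCallback?: StreamCallback;
}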

Also implemented the run function for OpenAI models, handling both streaming and non-streaming scenarios. Added a callback handler to support sending incremental updates back to the caller.
Added implementations for all of the CRUD operations.
Next up are tests and some additional APIs for parameterization.
This allows each model parser to determine how to render an Output object as a string value.
* Add some CRUD operations to make it easier to interact with AIConfig Output objects
* Also some minor refactorings to consolidate output handling in the AIConfigRuntime class.
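To make the new surface area concrete, here is a hedged usage sketch. The method names getLatestOutput, deleteOutput, and getOutputText appear in the diff below; the load() call, import path, file name, and prompt name are assumptions for illustration, not code from this PR.

import { AIConfigRuntime } from "aiconfig"; // import path is an assumption

// Hypothetical config file and prompt name, purely for illustration.
const config = AIConfigRuntime.load("travel.aiconfig.json");

// Read: grab the most recent output for a prompt, if any.
const latest = config.getLatestOutput("get_activities");

// Each model parser decides how to render an Output as a string; the
// config-level getOutputText used here is assumed to delegate to it.
if (latest) {
  console.log(config.getOutputText("get_activities", latest));
}

// Delete: remove the outputs recorded for the prompt
// (the method name is discussed in the review below).
config.deleteOutput("get_activities");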
);
}

const existingOutputs = [...(prompt.outputs || [])];
Contributor:

Nit: it seems a bit weird to return the outputs that are deleted; it sets an expectation that they can still be operated on, which could lead to issues.

* @param promptName The name of the prompt whose outputs to delete.
* @returns The outputs that were deleted from the prompt.
*/
public deleteOutput(promptName: string) {
Contributor:

The singular name makes it sound like it will only delete one output (e.g. the latest). It should probably be deleteOutputs. Could consider adding a param to specify latest-only.
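As a sketch of that suggestion only (not the code in this PR), a plural deleteOutputs with an optional latest-only flag might look like the following; the flag name and return behavior are assumptions.

// Sketch of the reviewer's suggestion; not the method implemented in this PR.
public deleteOutputs(promptName: string, latestOnly: boolean = false): Output[] {
  const existingPrompt = this.getPrompt(promptName);
  if (!existingPrompt) {
    throw new Error(
      `Cannot delete outputs. Prompt ${promptName} not found in AIConfig.`
    );
  }

  const outputs = existingPrompt.outputs ?? [];
  if (latestOnly) {
    // Remove and return just the most recent output.
    const deleted = outputs.slice(-1);
    existingPrompt.outputs = outputs.slice(0, -1);
    return deleted;
  }

  // Remove and return everything.
  existingPrompt.outputs = [];
  return outputs;
}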

prompt = this.getPrompt(prompt)!;
if (!prompt) {
throw new Error(
`E1029: Cannot delete outputs. Prompt ${prompt} not found in AIConfig.`
Contributor:

Should be a different error

*/
public getLatestOutput(prompt: string | Prompt): Output | undefined {
if (typeof prompt === "string") {
prompt = this.getPrompt(prompt)!;
Contributor:

Nit: prefer to use a different variable instead of reassigning the function param.
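A quick sketch of that nit, resolving into a separate local instead of reassigning the parameter; the variable names and error text are illustrative only.

// Illustrative only: resolve into a local rather than reassigning the param.
public getLatestOutput(promptOrName: string | Prompt): Output | undefined {
  const resolvedPrompt =
    typeof promptOrName === "string"
      ? this.getPrompt(promptOrName)
      : promptOrName;

  if (!resolvedPrompt) {
    // Error message here is illustrative, not the PR's actual text.
    throw new Error("Cannot get latest output. Prompt not found in AIConfig.");
  }

  const outputs = resolvedPrompt.outputs ?? [];
  return outputs.length > 0 ? outputs[outputs.length - 1] : undefined;
}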

prompt = this.getPrompt(prompt)!;
if (!prompt) {
throw new Error(
`E1030: Cannot delete outputs. Prompt ${prompt} not found in AIConfig.`
Contributor:

Same as above: this should be a new/different error.

*/
public getModelName(prompt: string | Prompt) {
if (typeof prompt === "string") {
prompt = this.getPrompt(prompt)!;
Contributor:

Nit: same as above re: reassigning.

prompt = this.getPrompt(prompt)!;
if (!prompt) {
throw new Error(
`E1031: Cannot delete outputs. Prompt ${prompt} not found in AIConfig.`
Contributor:

Same here re: the error.

return modelParser.getOutputText(prompt, this, output);
}

// TODO: saqadri - log a warning if the model parser isn't parameterized
Contributor:

Can remove this TODO as no longer relevant
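For context on the delegation above, here is a hedged sketch of what a parser-side getOutputText might look like; the class name and the assumption that output text lives in a data field are illustrative, not this PR's actual parser code.

// Hypothetical parser; the signature mirrors the call site above:
// modelParser.getOutputText(prompt, aiConfig, output).
class ExampleModelParser {
  getOutputText(
    prompt: Prompt,
    aiConfig: AIConfigRuntime,
    output?: Output
  ): string {
    const target = output ?? aiConfig.getLatestOutput(prompt);
    if (!target) {
      return "";
    }
    // Assumed shape: the model's text lives in a `data` field on the output.
    const data = (target as { data?: unknown }).data;
    return typeof data === "string" ? data : JSON.stringify(data ?? "");
  }
}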

@@ -31,7 +31,11 @@ export class ModelParserRegistry {
   }

   public static getModelParserForPrompt(prompt: Prompt) {
-    const id = ModelParser.getModelName(prompt);
+    const id =
Contributor:

Nice. This resolves a comment from an earlier PR.
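The added line above is truncated; as a hedged guess at the intent (resolving the model name from the prompt's own metadata rather than the ModelParser static helper), it might look something like the following. The field access and the getModelParser call are assumptions.

// Hedged guess only; the actual added line is truncated in the diff above.
public static getModelParserForPrompt(prompt: Prompt) {
  // Assumed: prompt.metadata.model may be a string id or an object with a name.
  const model = prompt.metadata?.model;
  const id = typeof model === "string" ? model : model?.name;
  return id != null ? ModelParserRegistry.getModelParser(id) : undefined;
}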

saqadri merged commit be8dd8d into main on Oct 13, 2023