
Replace RUN_PROMPT with STREAM_AICONFIG_CHUNK inside of streaming implementation #919

Merged
merged 3 commits into main from pr919 on Jan 15, 2024

Conversation

@rossdanlm (Contributor) commented Jan 14, 2024


This is the exact same logic; I'm just doing this as an intermediate step to keep things clear and easy to review.

## Test Plan
No functional changes; streaming and non-streaming models still work.
@rholinshead (Contributor) left a comment

Left some comments, but I think this is all just a copy/paste and the next PR changes it to what my comment was suggesting.

type: "STREAM_AICONFIG_CHUNK";
id: string;
cancellationToken?: string;
isRunning?: boolean;
Contributor:

With this new action, we should be able to infer isRunning in the reducer now instead of passing it through the action, right?

Contributor Author:

Yes, correct. This diff is all just copy/pasting to keep it easier to review and to show the steps I took!
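
(For context, here is a minimal sketch of what the reviewer is suggesting; all names below are assumptions for illustration, not the repo's actual types. Once the chunk action is its own type, the reducer can infer the running state from the action itself.)

```typescript
// Hypothetical sketch only; names are assumed, not the repo's real types.
type StreamAIConfigChunkAction = {
  type: "STREAM_AICONFIG_CHUNK";
  id: string;
  cancellationToken?: string;
};

type SketchState = { runningPromptId?: string };

function reduceStreamChunk(
  state: SketchState,
  action: StreamAIConfigChunkAction
): SketchState {
  // Receiving a chunk for prompt `id` implies that prompt is running,
  // so no separate isRunning flag needs to travel on the action.
  return { ...state, runningPromptId: action.id };
}
```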

Comment on lines +242 to +248
// Note: If we are calling "RUN_PROMPT" directly as a dispatched event
// type, we automatically set the state there to `isRunning` for that
// prompt. That logic does not happen here, it happens in
// `aiconfigReducer`.
// If we are calling "RUN_PROMPT" indirectly via the action of a
// "CONSOLIDATE_AICONFIG" dispatch, we end up here. We need to check
// if we actually want to set the prompt state to `isRunning`
Contributor:

Just from copy/paste?

rossdanlm added a commit that referenced this pull request Jan 15, 2024

Refactor ConsolidateAIConfig to only take in specified sub-actions (#917)

I was having some trouble converting everything to `STREAM_AICONFIG_CHUNK`, so I'm just taking really easy steps to keep the diffs easy to review and test. You'll see the final code changes at the end of the diff stack.

## This diff
No functional changes, but I want to use the `CONSOLIDATE_AICONFIG` event to pass in a nested `STREAM_AICONFIG_CHUNK` action later in the stack, without requiring me to pass it directly to the `aiconfigReducer`. By separating these, they can be distinct types that are easier to manage.
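
(A rough sketch of the shape this describes, using assumed names rather than the repo's actual definitions: the consolidated event carries a typed sub-action, so the sub-action never has to be dispatched to the reducer on its own.)

```typescript
// Illustrative only: assumed names, not the actual aiconfig types.
type ConsolidateAIConfigSubAction =
  | { type: "RUN_PROMPT"; id: string }
  | { type: "STREAM_AICONFIG_CHUNK"; id: string };

type ConsolidateAIConfigAction = {
  type: "CONSOLIDATE_AICONFIG";
  action: ConsolidateAIConfigSubAction; // nested sub-action
  config: unknown; // the AIConfig returned by the server to merge in
};
```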

---
Stack created with [Sapling](https://sapling-scm.com). Best reviewed
with
[ReviewStack](https://reviewstack.dev/lastmile-ai/aiconfig/pull/917).
* #928
* #926
* #925
* #924
* #922
* #921
* #920
* #919
* #918
* __->__ #917
rossdanlm added a commit that referenced this pull request Jan 15, 2024

[ez] Rename `aiconfig` stream response to `aiconfig_chunk` response (#918)


I was having some trouble converting everything to `STREAM_AICONFIG_CHUNK`, so I'm just taking really easy steps to keep the diffs easy to review and test. You'll be able to see the final result at the end of the diff stack.

## This diff
Just doing a very simple name copy-and-replace, no functional changes.

---
Stack created with [Sapling](https://sapling-scm.com). Best reviewed
with
[ReviewStack](https://reviewstack.dev/lastmile-ai/aiconfig/pull/918).
* #928
* #926
* #925
* #924
* #922
* #921
* #920
* #919
* __->__ #918
* #917
@rossdanlm rossdanlm merged commit 7ffccee into main Jan 15, 2024
2 checks passed
@rossdanlm rossdanlm deleted the pr919 branch January 15, 2024 20:12
rossdanlm added a commit that referenced this pull request Jan 15, 2024
Implement STREAM_AICONFIG_CHUNK action



This is the biggest part of this diff stack. The functionality is still
exactly the same as before, but now we can call it directly instead of
as a sub-action within `"CONSOLIDATE_AICONFIG"`. I think this is much
cleaner.
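
(A rough sketch, with assumed names, of what dispatching the chunk directly could look like compared to routing it through the consolidated event:)

```typescript
// Sketch only; the dispatch and action shapes are assumptions.
type Dispatch = (action: { type: string; id: string; output?: unknown }) => void;

function onStreamChunk(dispatch: Dispatch, promptId: string, chunk: unknown) {
  // Before: the chunk rode along inside "CONSOLIDATE_AICONFIG" as a
  // nested "RUN_PROMPT" sub-action.
  // After: it is a first-class action the reducer handles directly.
  dispatch({ type: "STREAM_AICONFIG_CHUNK", id: promptId, output: chunk });
}
```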

Another benefit is that this removes the need to check the `isRunning` flags within the consolidated `"RUN_PROMPT"` sub-action, since we now know it can only be called at the beginning and at the (non-streaming) end of a run. Next PR I am going to split this into two separate Actions:
1. `"RUN_PROMPT_START"`
2. `"RUN_PROMPT_SUCCESS"`


## Test Plan
Streaming and non-streaming models still work. You can cancel, it'll end, etc.


https://github.com/lastmile-ai/aiconfig/assets/151060367/cf29ee99-9ede-4c5e-99a7-a7f7816adfe1

---
Stack created with [Sapling](https://sapling-scm.com). Best reviewed
with
[ReviewStack](https://reviewstack.dev/lastmile-ai/aiconfig/pull/920).
* #928
* #926
* #925
* #924
* #922
* #921
* __->__ #920
* #919
* #918
* #917
rossdanlm added a commit that referenced this pull request Jan 15, 2024
Minor nits to cancellation error message


Put the prompt name in quotes. Also specify that we're resetting outputs only, not the entire prompt or AIConfig. Also refresh the AIConfig to make sure that changes are propagated.

I also noticed that if you don't refresh, the text lingers there. This gets fixed automatically with the next PR in the diff stack!

## Test Plan

Before

<img width="447" alt="Screenshot 2024-01-14 at 03 00 22"
src="https://github.com/lastmile-ai/aiconfig/assets/151060367/6dc5b73b-507d-4808-ad9f-d934a0597420">


After

<img width="442" alt="Screenshot 2024-01-14 at 03 00 05"
src="https://github.com/lastmile-ai/aiconfig/assets/151060367/7f32d2d3-e863-4e84-b42f-c9587b93e3c2">

---
Stack created with [Sapling](https://sapling-scm.com). Best reviewed
with
[ReviewStack](https://reviewstack.dev/lastmile-ai/aiconfig/pull/921).
* #928
* #926
* #925
* #924
* #922
* __->__ #921
* #920
* #919
* #918
* #917
rossdanlm added a commit that referenced this pull request Jan 15, 2024
Create new action RUN_PROMPT_CANCEL

Splitting the RUN_PROMPT action into 4 sub-categories:

1. RunPrompt
2. RunPromptCancel
3. RunPromptSuccess
4. RunPromptError

Having 4 separate events means that we can call these directly without having to use RUN_PROMPT within CONSOLIDATE_AICONFIG.

Another benefit is that it fixes a bug where the state didn't get properly refreshed, so the output was shown with the existing cancelled output and not the previous one.
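
(Sketched from the description above, with assumed field names, the split could look roughly like this as a discriminated union:)

```typescript
// Hypothetical shapes inferred from the PR description, not the real code.
type RunPromptAction =
  | { type: "RUN_PROMPT"; id: string; cancellationToken?: string }
  | { type: "RUN_PROMPT_CANCEL"; id: string; config?: unknown }
  | { type: "RUN_PROMPT_SUCCESS"; id: string; config?: unknown }
  | { type: "RUN_PROMPT_ERROR"; id: string; message?: string };
```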

## Test Plan
Before (the output state lingers instead of being cleared; if this is a feature rather than a bug, that's easy to keep: just don't reset the client prompt output to what the server responds back with)


https://github.com/lastmile-ai/aiconfig/assets/151060367/b638bf05-60d5-4460-ad76-3c968366ced6


After
Everything works as previously (can update input + metadata on client
and that remains unchanged, only output gets restored)


https://github.com/lastmile-ai/aiconfig/assets/151060367/a2fd61ec-08bf-4bd4-8766-97b01bc84c8e

---
Stack created with [Sapling](https://sapling-scm.com). Best reviewed
with
[ReviewStack](https://reviewstack.dev/lastmile-ai/aiconfig/pull/922).
* #928
* #926
* #925
* #924
* __->__ #922
* #921
* #920
* #919
* #918
* #917
rossdanlm added a commit that referenced this pull request Jan 15, 2024
Create new action RUN_PROMPT_SUCCESS


The main reason I'm doing this is as an intermediate step to show that I can remove RunPromptAction from `ConsolidateAIConfigSubAction`, so that we only need to define the "RUN_PROMPT" action one time.

Next PR I'm going to merge "STOP_STREAMING", "RUN_PROMPT_CANCEL" and "RUN_PROMPT_SUCCESS" into a single action called "RUN_PROMPT_COMPLETE", because all the logic they share is the same. I'm just doing this slowly instead of all at once to make things easier to review.


## Test Plan
Both streaming and non-streaming still runs as before, just like the
rest of diffs in this stack

---
Stack created with [Sapling](https://sapling-scm.com). Best reviewed
with
[ReviewStack](https://reviewstack.dev/lastmile-ai/aiconfig/pull/924).
* #928
* #926
* #925
* __->__ #924
* #922
* #921
* #920
* #919
* #918
* #917
rossdanlm added a commit that referenced this pull request Jan 15, 2024
Rename RUN_PROMPT to RUN_PROMPT_START

Keeping the diff small and easy to review. With the last diff, we removed the ambiguity of when `RUN_PROMPT` is called; now it's only ever called at the beginning of the run prompt flow.

## Test Plan
Same as before (exact same videos), simple re-naming, no functional
changes

---
Stack created with [Sapling](https://sapling-scm.com). Best reviewed
with
[ReviewStack](https://reviewstack.dev/lastmile-ai/aiconfig/pull/925).
* #928
* #926
* __->__ #925
* #924
* #922
* #921
* #920
* #919
* #918
* #917
rossdanlm added a commit that referenced this pull request Jan 15, 2024
Split Action Types and AIConfigReducer into their own files

I just feel that having the action type definitions separate from the reducer implementation makes this a lot easier to read. You can now have the files side by side instead of having to jump back and forth within the same one. I also merged the run commands together under the `RunPromptAction` type to make things cleaner; this will make it easier next diff, once I merge the logic together for the run prompts.

I also cleaned up the `UPDATE_PROMPT_INPUT` implementation inside of the aiconfigReducer because it wasn't passing in the correct input type anyway, so now that's cleaner too.
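
(Roughly, the split might look like this; the file names and type members are assumptions for illustration, not the actual layout.)

```typescript
// actions.ts (assumed file name): action type definitions only.
export type RunPromptAction =
  | { type: "RUN_PROMPT"; id: string }
  | { type: "RUN_PROMPT_CANCEL"; id: string }
  | { type: "RUN_PROMPT_SUCCESS"; id: string };

export type AIConfigReducerAction =
  | RunPromptAction
  | { type: "UPDATE_PROMPT_INPUT"; id: string; input: string | { data: string } };

// aiconfigReducer.ts would then just:
// import { AIConfigReducerAction } from "./actions";
```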

## Test Plan
Everything works the same as before


https://github.com/lastmile-ai/aiconfig/assets/151060367/46c11941-c055-4307-99b2-3cd6202b78cb

---
Stack created with [Sapling](https://sapling-scm.com). Best reviewed
with
[ReviewStack](https://reviewstack.dev/lastmile-ai/aiconfig/pull/926).
* #928
* __->__ #926
* #925
* #924
* #922
* #921
* #920
* #919
* #918
* #917
rossdanlm added a commit that referenced this pull request Jan 15, 2024
Merge STOP_STREAMING event into RUN_PROMPT_SUCCESS

I just feel it makes sense to merge these two events together. I think this will be the final refactoring change I do.

I also:

1. Changed the `id` fields to `promptId` to be more explicit
2. Centralized the logic for setting the ClientAIConfig into a `setRunningPromptId` function (see the sketch after this list)
3. Re-arranged the action cases in the aiconfigReducer switch to keep the run actions together, with save success at the end
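
(A minimal sketch of point 2, assuming a ClientAIConfig shape with UI metadata; the real type in the repo may differ.)

```typescript
// Assumed ClientAIConfig shape, for illustration only.
type ClientAIConfig = { _ui: { runningPromptId?: string } };

function setRunningPromptId(
  config: ClientAIConfig,
  runningPromptId: string | undefined
): ClientAIConfig {
  // Single place that flips which prompt (if any) is currently running.
  return { ...config, _ui: { ...config._ui, runningPromptId } };
}
```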

## Test Plan
Everything still works as usual for streaming and non-streaming. Same
video as the PRs earlier in this stack

---
Stack created with [Sapling](https://sapling-scm.com). Best reviewed
with
[ReviewStack](https://reviewstack.dev/lastmile-ai/aiconfig/pull/928).
* __->__ #928
* #926
* #925
* #924
* #922
* #921
* #920
* #919
* #918
* #917