Implement STREAM_AICONFIG_CHUNK action #920
Merged
Conversation
This was referenced Jan 14, 2024
rossdanlm force-pushed the pr920 branch 2 times, most recently from 730a4c8 to fbfddb6 on January 14, 2024 07:55
rossdanlm requested review from saqadri, rholinshead, suyoglastmileai, Ankush-lastmile and jonathanlastmileai as code owners on January 14, 2024 08:13
This is the biggest part of this diff stack. The functionality is exactly the same as before, but now we can call `STREAM_AICONFIG_CHUNK` directly instead of as a sub-action within `CONSOLIDATE_AICONFIG`. I think this is much cleaner.

Another benefit is that this removes the need to check the `isRunning` flags within the consolidated `RUN_PROMPT` sub-action, since we now know that it can only be called at the beginning and at the (non-streaming) end. In the next PR I am going to split this into two separate actions:

1. `RUN_PROMPT_START`
2. `RUN_PROMPT_SUCCESS`

## Test Plan

Streaming and non-streaming models still work. Can cancel, it'll end, etc.

https://github.com/lastmile-ai/aiconfig/assets/151060367/cf29ee99-9ede-4c5e-99a7-a7f7816adfe1
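The promotion described above can be sketched as a discriminated union where the chunk update is a top-level reducer action rather than a nested sub-action. This is a minimal, hypothetical sketch: the state shape, field names, and the `ConsolidateAIConfigAction` payload are illustrative, not the repo's real types.

```typescript
// Hypothetical sketch of a chunk update as a top-level reducer action.
type AIConfigState = { runningPromptId?: string; chunks: string[] };

type StreamAIConfigChunkAction = {
  type: "STREAM_AICONFIG_CHUNK";
  promptId: string;
  chunk: string;
};

type ConsolidateAIConfigAction = {
  type: "CONSOLIDATE_AICONFIG";
  // Before this change, the chunk update rode along here as a nested
  // sub-action; now it is dispatched directly instead.
};

type AIConfigAction = StreamAIConfigChunkAction | ConsolidateAIConfigAction;

function aiconfigReducer(
  state: AIConfigState,
  action: AIConfigAction
): AIConfigState {
  switch (action.type) {
    case "STREAM_AICONFIG_CHUNK":
      // Dispatched only from the streaming path, so no isRunning check
      // is needed here.
      return {
        ...state,
        runningPromptId: action.promptId,
        chunks: [...state.chunks, action.chunk],
      };
    case "CONSOLIDATE_AICONFIG":
      return state;
  }
}
```

Because the union is discriminated on `type`, the compiler also catches any dispatch with an unknown action shape.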
rholinshead approved these changes on Jan 15, 2024
rossdanlm added a commit that referenced this pull request on Jan 15, 2024
Refactor ConsolidateAIConfig to only take in specified sub-actions (#917)

I was having some trouble converting everything to `STREAM_AICONFIG_CHUNK`, so I'm doing really easy steps to keep diffs easy to review and test. You'll see the final code changes at the end of the diff stack.

## This diff

No functional changes, but I want to use the `CONSOLIDATE_AICONFIG` event to pass in the nested `STREAM_AICONFIG_CHUNK` action later in the stack, without requiring me to pass it directly to the `aiconfigReducer`. By separating these, they can be distinct types and easier to manage.

---

Stack created with [Sapling](https://sapling-scm.com). Best reviewed with [ReviewStack](https://reviewstack.dev/lastmile-ai/aiconfig/pull/917).

* #928
* #926
* #925
* #924
* #922
* #921
* #920
* #919
* #918
* __->__ #917
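The sub-action narrowing this commit describes can be sketched as restricting the nested action to an explicit union instead of accepting any reducer action. The sub-action names and payloads below are hypothetical illustrations, not the repo's real definitions.

```typescript
// Hypothetical sketch: only an explicit list of sub-action types may be
// nested inside CONSOLIDATE_AICONFIG.
type AddPromptAction = { type: "ADD_PROMPT"; promptName: string };
type UpdatePromptInputAction = {
  type: "UPDATE_PROMPT_INPUT";
  promptName: string;
  input: string;
};

type ConsolidateAIConfigSubAction = AddPromptAction | UpdatePromptInputAction;

type ConsolidateAIConfigAction = {
  type: "CONSOLIDATE_AICONFIG";
  action: ConsolidateAIConfigSubAction;
};

function describeSubAction(action: ConsolidateAIConfigAction): string {
  // Exhaustive switch: the compiler flags any sub-action not in the union.
  switch (action.action.type) {
    case "ADD_PROMPT":
      return `add ${action.action.promptName}`;
    case "UPDATE_PROMPT_INPUT":
      return `update ${action.action.promptName}`;
  }
}
```

The payoff is that adding a new top-level action like `STREAM_AICONFIG_CHUNK` no longer widens what `CONSOLIDATE_AICONFIG` can carry.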
rossdanlm added a commit that referenced this pull request on Jan 15, 2024
[ez] Rename `aiconfig` stream response to `aiconfig_chunk` response (#918)

I was having some trouble converting everything to `STREAM_AICONFIG_CHUNK`, so I'm doing really easy steps to keep diffs easy to review and test. You'll be able to see the final result at the end of the diff stack.

## This diff

Just a very simple name copy-and-replace, no functional changes.

---

Stack created with [Sapling](https://sapling-scm.com). Best reviewed with [ReviewStack](https://reviewstack.dev/lastmile-ai/aiconfig/pull/918).

* #928
* #926
* #925
* #924
* #922
* #921
* #920
* #919
* __->__ #918
* #917
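The rename can be sketched on the client side as distinguishing a streamed chunk payload from a full-config payload by key. The payload fields beyond the renamed keys are assumptions for illustration.

```typescript
// Hypothetical sketch: the stream previously sent chunk updates under an
// `aiconfig` key; after the rename they arrive under `aiconfig_chunk`,
// leaving `aiconfig` to mean a full config payload.
type StreamEvent =
  | { aiconfig_chunk: { output: string } } // partial, streamed update
  | { aiconfig: { prompts: string[] } }; // full config, e.g. final state

function handleEvent(event: StreamEvent): string {
  if ("aiconfig_chunk" in event) {
    return `chunk: ${event.aiconfig_chunk.output}`;
  }
  return `full config with ${event.aiconfig.prompts.length} prompt(s)`;
}
```

Giving the two payloads distinct keys lets the handler branch without inspecting the payload contents.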
rossdanlm added a commit that referenced this pull request on Jan 15, 2024
Replace RUN_PROMPT with STREAM_AICONFIG_CHUNK inside of streaming implementation (#919)

This is the exact same logic, but I'm doing this as an intermediate step to keep things clear and easy to review.

## Test Plan

No functional changes; streaming and non-streaming models still work.

---

Stack created with [Sapling](https://sapling-scm.com). Best reviewed with [ReviewStack](https://reviewstack.dev/lastmile-ai/aiconfig/pull/919).

* #928
* #926
* #925
* #924
* #922
* #921
* #920
* __->__ #919
* #918
* #917
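The swap this commit describes can be sketched at the dispatch site inside the streaming callback. `Dispatch`, the callback name, and the action shapes below are illustrative assumptions, not the repo's real signatures.

```typescript
// Hypothetical sketch of the streaming callback's dispatch site.
type Dispatch = (action: { type: string; [key: string]: unknown }) => void;

const dispatched: string[] = [];
const dispatch: Dispatch = (a) => dispatched.push(a.type);

function onStreamChunk(promptId: string, chunk: string) {
  // Before: the chunk update rode inside a consolidated sub-action, e.g.
  //   dispatch({ type: "CONSOLIDATE_AICONFIG",
  //              action: { type: "RUN_PROMPT", promptId, chunk } });
  // After: the same logic, dispatched directly:
  dispatch({ type: "STREAM_AICONFIG_CHUNK", promptId, chunk });
}
```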
rossdanlm added a commit that referenced this pull request on Jan 15, 2024
Minor nits to cancellation error message (#921)

Put the prompt name in quotes. Also specify that we're resetting only the outputs, not the entire prompt or AIConfig.

Also refreshed the AIConfig to make sure that changes are propagated. I noticed that if you don't refresh, the text lingers there. This gets fixed automatically with the next PR in the diff stack!

## Test Plan

Before

<img width="447" alt="Screenshot 2024-01-14 at 03 00 22" src="https://github.com/lastmile-ai/aiconfig/assets/151060367/6dc5b73b-507d-4808-ad9f-d934a0597420">

After

<img width="442" alt="Screenshot 2024-01-14 at 03 00 05" src="https://github.com/lastmile-ai/aiconfig/assets/151060367/7f32d2d3-e863-4e84-b42f-c9587b93e3c2">

---

Stack created with [Sapling](https://sapling-scm.com). Best reviewed with [ReviewStack](https://reviewstack.dev/lastmile-ai/aiconfig/pull/921).

* #928
* #926
* #925
* #924
* #922
* __->__ #921
* #920
* #919
* #918
* #917
rossdanlm added a commit that referenced this pull request on Jan 15, 2024
Create new action RUN_PROMPT_CANCEL (#922)

Splitting the RUN_PROMPT action into 4 sub-categories:

1. RunPrompt
2. RunPromptCancel
3. RunPromptSuccess
4. RunPromptError

Having 4 separate events means that we can call these directly without having to use RUN_PROMPT within CONSOLIDATE_AICONFIG.

Another benefit is that this fixes a bug where the state doesn't get properly refreshed, so the output is shown with the existing cancelled output and not the previous one.

## Test Plan

Before (the output state lingers instead of being cleared; if this is a feature rather than a bug, it's easy to keep: just don't reset the client prompt output to what the server responds with)

https://github.com/lastmile-ai/aiconfig/assets/151060367/b638bf05-60d5-4460-ad76-3c968366ced6

After

Everything works as previously (you can update input + metadata on the client and that remains unchanged; only the output gets restored).

https://github.com/lastmile-ai/aiconfig/assets/151060367/a2fd61ec-08bf-4bd4-8766-97b01bc84c8e

---

Stack created with [Sapling](https://sapling-scm.com). Best reviewed with [ReviewStack](https://reviewstack.dev/lastmile-ai/aiconfig/pull/922).

* #928
* #926
* #925
* #924
* __->__ #922
* #921
* #920
* #919
* #918
* #917
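The cancel behavior described above (reset only the outputs, keep client-side input and metadata edits) can be sketched as follows. The state shape and function name are illustrative, not the repo's real `ClientAIConfig` types.

```typescript
// Hypothetical sketch of the RUN_PROMPT_CANCEL handling: only the prompt's
// outputs are replaced with what the server returns; input edits survive.
type Prompt = { id: string; input: string; outputs: string[] };
type State = { runningPromptId?: string; prompts: Prompt[] };

function runPromptCancel(
  state: State,
  promptId: string,
  serverOutputs: string[]
): State {
  return {
    ...state,
    // Clear the running flag so the cancelled output doesn't linger.
    runningPromptId: undefined,
    prompts: state.prompts.map((p) =>
      // Only the outputs are restored; input and metadata stay untouched.
      p.id === promptId ? { ...p, outputs: serverOutputs } : p
    ),
  };
}
```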
rossdanlm added a commit that referenced this pull request on Jan 15, 2024
Create new action RUN_PROMPT_SUCCESS (#924)

The main reason I'm doing this is as an intermediate step to show that I can remove RunPromptAction from `ConsolidateAIConfigSubAction`, so that we now only need to define the "RUN_PROMPT" action once.

In the next PR I'm going to merge "STOP_STREAMING", "RUN_PROMPT_CANCEL" and "RUN_PROMPT_SUCCESS" into a single action called "RUN_PROMPT_COMPLETE" because the logic they share is the same. I'm doing this slowly instead of all at once to make things easier to review.

## Test Plan

Both streaming and non-streaming still run as before, just like the rest of the diffs in this stack.

---

Stack created with [Sapling](https://sapling-scm.com). Best reviewed with [ReviewStack](https://reviewstack.dev/lastmile-ai/aiconfig/pull/924).

* #928
* #926
* #925
* __->__ #924
* #922
* #921
* #920
* #919
* #918
* #917
rossdanlm added a commit that referenced this pull request on Jan 15, 2024
Rename RUN_PROMPT to RUN_PROMPT_START (#925)

Keeping the diff small and easy to review. With the last diff, we removed the ambiguity of when `RUN_PROMPT` is called; now it's only ever called at the beginning of the run prompt flow.

## Test Plan

Same as before (exact same videos): simple renaming, no functional changes.

---

Stack created with [Sapling](https://sapling-scm.com). Best reviewed with [ReviewStack](https://reviewstack.dev/lastmile-ai/aiconfig/pull/925).

* #928
* #926
* __->__ #925
* #924
* #922
* #921
* #920
* #919
* #918
* #917
rossdanlm added a commit that referenced this pull request on Jan 15, 2024
Split Action Types and AIConfigReducer into its own folder (#926)

I just feel that separating the action type definitions from the reducer implementation makes this a lot easier to read. You can now have the files side by side instead of having to jump back and forth within the same one.

I also merged the run commands together under the `RunPromptAction` type to make things cleaner. This will make it easier next diff once I merge the logic together for the run prompts.

I also cleaned up the `UPDATE_PROMPT_INPUT` implementation inside the aiConfigReducer because it wasn't passing in the correct input type anyway, so now that's cleaner too.

## Test Plan

Everything works the same as before.

https://github.com/lastmile-ai/aiconfig/assets/151060367/46c11941-c055-4307-99b2-3cd6202b78cb

---

Stack created with [Sapling](https://sapling-scm.com). Best reviewed with [ReviewStack](https://reviewstack.dev/lastmile-ai/aiconfig/pull/926).

* #928
* __->__ #926
* #925
* #924
* #922
* #921
* #920
* #919
* #918
* #917
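Grouping the run commands under one `RunPromptAction` type, as described above, can be sketched as a union in the action-types module. The payload fields below are assumptions for illustration.

```typescript
// Hypothetical sketch of the run actions grouped under one union type,
// as they might live in a dedicated action-types file.
type RunPromptAction =
  | { type: "RUN_PROMPT_START"; promptId: string }
  | { type: "RUN_PROMPT_CANCEL"; promptId: string }
  | { type: "RUN_PROMPT_SUCCESS"; promptId: string }
  | { type: "RUN_PROMPT_ERROR"; promptId: string; message: string };

function isTerminal(action: RunPromptAction): boolean {
  // Everything except START ends the run; a helper like this only needs
  // to accept one union type instead of four unrelated action types.
  return action.type !== "RUN_PROMPT_START";
}
```

One benefit of the union is that reducer cases and helpers can accept `RunPromptAction` and let the `type` field narrow the payload.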
rossdanlm added a commit that referenced this pull request on Jan 15, 2024
Merge STOP_STREAMING event into RUN_PROMPT_SUCCESS (#928)

I just feel it makes sense to merge these two events together. I think this will be the final refactoring change I do.

I also:

1. Changed the `id` fields to `promptId` to be more explicit
2. Centralized the logic for setting the ClientAIConfig into a `setRunningPromptId` function
3. Re-arranged the action cases in the aiconfigReducer switch to keep the run actions together, with save success at the end

## Test Plan

Everything still works as usual for streaming and non-streaming. Same video as the PRs earlier in this stack.

---

Stack created with [Sapling](https://sapling-scm.com). Best reviewed with [ReviewStack](https://reviewstack.dev/lastmile-ai/aiconfig/pull/928).

* __->__ #928
* #926
* #925
* #924
* #922
* #921
* #920
* #919
* #918
* #917
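The centralization described in point 2 can be sketched as a single helper that both the success and former stop-streaming paths call. The state shape and helper body are illustrative assumptions, not the repo's real implementation.

```typescript
// Hypothetical sketch of a centralized setRunningPromptId helper and the
// merged RUN_PROMPT_SUCCESS handling.
type State = { _ui: { runningPromptId?: string } };

function setRunningPromptId(
  state: State,
  promptId: string | undefined
): State {
  // One place to update the running-prompt marker, instead of each
  // reducer case rebuilding the nested state by hand.
  return { ...state, _ui: { ...state._ui, runningPromptId: promptId } };
}

function onRunPromptSuccess(state: State): State {
  // STOP_STREAMING was merged into RUN_PROMPT_SUCCESS: both cleared the
  // running prompt, so one case now does it.
  return setRunningPromptId(state, undefined);
}
```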